Sat Feb 06 21
In the 1980s we had what I can only call the perfect software security ecosystem. Back then, the computer was able to determine - by virtue of the fact that it was in my physical house - that it belonged to me, and the authentication story was that if I gave the computer a steady flow of electricity, I got access to every feature of the operating system. By comparison, the process of authentication today looks somewhat different. "But what about save files?" you might reasonably ask. Well, games back then were entirely stateless. There was no such thing as saving data because there was no data to save. The player was asked to sit through the entire challenge of the game - requiring skill, luck and ingenuity - in every sitting. This lack of convenience led to the perception that 'games were harder back in the day...'. Some games let you input a data-encoded password that forced the game to load into a certain state, so you could seemingly 'continue' where you left off. For the most part, though, they were stateless.
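Those continue-passwords worked by packing the whole game state into a few bytes and encoding them as something a human could type back in. A minimal sketch - the format here is hypothetical, every real game rolled its own:

```python
import base64
import struct

def make_continue_password(level: int, lives: int, score: int) -> str:
    """Pack the game state into bytes, then encode it as a typeable password."""
    state = struct.pack(">BBH", level, lives, score)  # 4 bytes of state
    return base64.b32encode(state).decode().rstrip("=")

def load_continue_password(password: str) -> tuple[int, int, int]:
    """Decode the password back into (level, lives, score)."""
    padded = password + "=" * (-len(password) % 8)  # restore base32 padding
    level, lives, score = struct.unpack(">BBH", base64.b32decode(padded))
    return level, lives, score
```

No account, no server, no stored data: the "save file" lives entirely in the player's notebook.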
Passwords are a simple and elegant solution to the multi-user problem. To prevent User 1 from accessing User 2's data, you need a phrase that only User 1 knows; given it, the computer loads into the state that User 1 is familiar with. This includes User 1's favourite desktop theme, their icon layout and, most importantly, their game save files.
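For the curious, the "I know the password, therefore it's me" check is normally done without the computer ever storing the password itself - only a salted hash of it. A minimal sketch using Python's standard library (iteration count is illustrative):

```python
import hashlib
import hmac
import os

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Derive a salted hash; only the salt and hash are stored, never the password."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
    return salt, digest

def verify_password(password: str, salt: bytes, stored: bytes) -> bool:
    """Recompute the hash and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
    return hmac.compare_digest(candidate, stored)
```

One phrase in User 1's head, one hash on disk, and the whole multi-user problem is solved. Naturally, this could not be allowed to stand.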
Because simple solutions like "It's in my house, therefore it's mine" and "I know the password, therefore it's me" would invalidate the jobs of most software architects, they took a long hard look at these multi-decade paradigms and decided, within the last five years, to change them. In comes "2FA", where you need to know the password, have access to the email account, and hold a "linked device" that you're able to authenticate to. This balances an 'acceptable' level of user frustration against the 'added protection' users feel as they mistype their 2FA code into an onscreen keyboard, fat-fingering it over and over again.
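The code you keep fat-fingering is typically a TOTP (RFC 6238): an HMAC-SHA1 over the current 30-second counter, truncated down to six digits. A sketch using only the standard library:

```python
import base64
import hmac
import struct
import time

def totp(secret_b32, at=None, digits=6, step=30):
    """RFC 6238 TOTP: HMAC-SHA1 over the current time-step counter."""
    key = base64.b32decode(secret_b32)
    counter = int((at if at is not None else time.time()) // step)
    mac = hmac.new(key, struct.pack(">Q", counter), "sha1").digest()
    offset = mac[-1] & 0x0F  # dynamic truncation, per RFC 4226
    code = (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF) % 10 ** digits
    return str(code).zfill(digits)
```

Both your device and the server share the secret and the clock, so both can compute the same six digits - which you then get thirty seconds to transcribe correctly.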
Of course, now that this is industry standard practice, we - as software architects - want to change it again. It's been decided (by people who aren't you) that the software you use (you don't own it, so you get no say) will now authenticate you by you knowing your password, being able to open the email address of the account you're accessing, having access to the linked device, and smiling for that device's camera so that the machine can compare your facial features with its recorded likeness of you. If all four factors pass, then the software may allow you access to the account you're allowed to use. Thank God for safety.
As a software architect watching the authentication dynamics quickly reach maturity, I feel the urge to ~~save my job~~ give the users an extra level of security. Because, you see, people simply don't feel secure with their current level of access and knowledge. It's not enough to have a linked social profile, an email address, a secondary device, a machine-recognized similar face and a secret password that exists only in the user's brain. The possibility of being hacked is too damn high! All over the internet and across the world, users are breaking down the virtual doors of my Second Life house to beg: "We need more account security!" And so, with a heavy heart and a furrowed, determined brow, I get to work...
Introducing 12-factor authentication for the modern 12-factor app. It's well known by science that the more factors your app has, the more circles of ~~hell~~ auth you have to get through to reach the limited functionality the software decides to let you have, based on your access level. To authenticate with a 12-factor application, we will require users to have the following:
1. The email address associated with the user
2. A linked device that the company is aware of
3. Location services and camera enabled on the linked device, so the company can confirm that the user's current face is recognizably similar to the on-file face and that the user's location is within acceptable parameters
4. A linked social account from a list of companies large enough to be considered 'trustworthy'. Being a US military contractor is the minimum bar of entry
5. A pass-nursery-rhyme: requiring an 8-character password was considered secure in the mid 90s, but some time in the 10s that was changed to "passphrase" because passphrases are harder to brute force. What's even better than that? Training users to invent their own nursery rhyme to tell the computer. This way we can programmatically assess the user's literary style against the on-file literary style
6. Full biometrics: fingerprinting, licking the secondary device so it can collect and analyze DNA, and of course giving it a blood sample on first login
7. Endorsement from other users: popup notifications appear for users who are logged into the app, asking "Do you want to allow CuteCat123 access to their account?" with a yes/no dialog. To ensure the user is giving us their real intention, yes and no will be positionally switched and reverse color coded
8. A linked TikTok video of the user doing a wiggle dance, pointing at things, and ending with "I love the company"
9. A complete medical history, with personal medical questions repeated back on request
10. A video interview in business formals between the user and a company representative
11. "Always on" connectivity to three mega-servers that are geographically distributed, fully redundant, and which independently authenticate both device MAC addresses. These servers are colloquially known as the "three wise men": Casper, Melchior and Balthazar
12. A company representative is contacted and asked to manually review all of the data; that representative will then allow the user access to the small part of the software that the company approves of
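The whole gate can be sketched as one merciless checklist - factor names here are my own invention, not a real API:

```python
# The twelve factors, in the order the user suffers them.
FACTORS = [
    "email_verified", "linked_device_present", "face_and_location_ok",
    "trustworthy_social_account", "pass_nursery_rhyme", "full_biometrics",
    "peer_endorsement", "wiggle_dance_video", "medical_history",
    "formal_video_interview", "three_wise_men_online", "manual_review_approved",
]

def authenticate(evidence: dict[str, bool]) -> bool:
    """Grant access only when all twelve factors pass; any missing or failed
    factor means denial, obviously."""
    return all(evidence.get(factor, False) for factor in FACTORS)
```

Note the failure mode: one dropped packet to the three wise men, or one unconvincing wiggle dance, and it's back to factor one.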
Only then will users finally feel safe to access ~~their data~~ the company's data on them.