(LinkedIn, 7-Oct-2021) Toward a self-regulating technoconomy
This is not a ‘social post,’ nor incendiary anti-big-corp vitriol, nor a plea for big-gov regulatory oversight. No, this could be a business idea, an open-source project, a non-profit, or all of the above. My goal is to engage the technoconomy to find a way to ‘self-regulate.’
Before there was Facebook whistleblower Frances Haugen, there was Tristan Harris. Before there was NSA whistleblower Edward Snowden, there was Bill Binney, and before him, Perry Fellwock. Often, it takes more than one whistleblower to raise awareness. I applaud Ms. Haugen’s efforts to raise awareness of these Facebook issues. Her revelations shed more light on what Tristan Harris and others previously spoke of in TED Talks and in the documentary The Social Dilemma.
There is a misalignment between the “altruistic mission” of various social media enterprises and their revenue models: models built on ‘attention,’ ‘persuasion,’ ‘behavior manipulation,’ and ‘surveillance.’
But I offer that this problem is not limited to the Facebooks and Twitters of the world. Google is at the center of it as well. Some might even say that Google pioneered it. The major and minor news outlets around the globe also increasingly use this model.
All these industries employ “continuous digital engagement” (read: techniques to keep you scrolling and hyperlinking for hours). This increases their advertising and affiliate revenues. But “continuous digital engagement” means using the kinds of cognitive hacking that Harris first brought to light in April 2017.
What Ms. Haugen has exposed is the depth of understanding these companies have of the problems, and their unwillingness to solve them. I respect that you cannot walk into HQ some Monday morning and announce that you are going to radically change your revenue model — for a loss. This would devastate thousands of employees’ incomes and tank the stock price. A more pragmatic approach is to develop a new revenue stream that separates from the problem: a stream more profitable than the current one that does not depend on the shenanigans (sorry for the technical term) Ms. Haugen has shared. I am not seeing much evidence of Facebook (or Twitter, etc.) making this kind of courageous pivot.
What, then, is the solution?
A False or Partial Solution:
The prevailing thinking seems to favor legislated regulations that prevent such behavior. Okay, let’s say that is the right answer. Let’s even pretend that enough of both parties in both houses of Congress can actually agree on something. Let’s say they pass a law mandating that all this code has to change. Keep in mind, the code only actualizes the revenue model, which is the real source of the issue but very hard to ‘legislate away.’ So, how would the government actualize such law(s)? More critically, how would they enforce them?
Enforcement is the focus of this article, and to which we now turn our attention.
Legislators are not often mathematicians, computer/data/cognitive scientists, or programmers. The kinds of algorithms that Harris or Haugen talk about consist of thousands to millions of lines of program code. That code involves computational methods, linear algebra, statistics, machine learning (AI) and a lot more. It seems unlikely that the legislators will be the enforcers of their laws.
Many decades ago, similar concerns emerged about abuses by the railroads, early energy companies, and Wall Street. Congress also grew concerned about certain biology-based companies (pharmaceuticals), chemistry-based companies (plastics, etc.), and physics-based ones (nuclear power). (See a pattern?) So, regulations followed for each. To serve this regulatory demand, new disciplines emerged, like financial auditing and independent scientific commissions (e.g., the IAEA). New armies of trained and skilled practitioners would pore over mountains of financial and laboratory records, perform scientific tests, and more, to ensure that no shenanigans (again, sorry for the technical term) were going on that violated laws or regulations. Simply put, these non-scientific legislators had to create armies of scientific practitioners to act on their behalf. Software systems today are likewise increasingly based on the sciences of mathematics, cognition, and psychology, as well as on computation and information theory.
Congress could pass a law telling Facebook or Google or any of a myriad of online-services companies to STOP doing this or that shenanigan. Those companies would nod their heads in collective compliance and ostensibly instruct their own armies of the very best and brightest in the world to change all those millions of lines of code to bring their software into compliance. Could anyone reading this right now show or prove that they had complied? How? My strong opinion, reasonably strongly held, is that this would be exceedingly difficult for regulators to accomplish, and very easy for these companies to obfuscate (read: disobey) without anyone being able to prove otherwise.
This is because of one and only one reason: brain trusts. Imagine you are a genius level math and computer science graduate with two options:
- go to work for a globally recognized sexy company offering a six-figure salary, free stock options, free lunches and a ping-pong table in your break room — AND — the opportunity to do really challenging and creative work; or
- go to work for some government agency making way less than a six-figure salary, with no stock options, no free lunches, no ping-pong table — AND — mind-numbing administrivia work plowing through millions of lines of code looking for that one subroutine that violates some paragraph and subsection of regulation.
Which would you choose?
Ergo, my money would be on the companies making a very good showing of compliance: saying all the right things but not doing (m)any of them. And how would anyone be able to prove otherwise? This is the classic problem of unenforceable legislation and regulation.
How, then, are we to solve this problem?
An Alternative Partial Solution:
One alternative approach (there are more) might be to require all such companies to submit their code and documentation to a third-party escrow and “auditing” agent (not the government). These auditors would be paid by their clients (the companies under audit) to prove compliance. This is exactly how financial audits work today. These auditors could also be “bonused” by the US Dept. of Commerce or Justice, or whatever umbrella department is appropriate, upon discovery of violations, which incentivizes them to find shenanigans. That government agency would fund those bonuses through the fines it levies on regulatory violators.
This is actually not a new model. It is borrowed from various other industries including healthcare, law enforcement and the financial sector.
In this model, those genius math and computer science graduates might still have choice #1 above, but choice #2 becomes a different kind of private company, offering all the same benefits. Now you have a fighting chance at hiring the talent you need to figure out who is compliant and who isn’t.
Silicon Valley is one of the greatest concentrations of really smart, creative, and highly invested communities in all of history. And the “impenetrable boxes” they are building are just as opaque to legislators and most of the rest of us as organic chemistry, nuclear power, or, soon enough, another battlefront: genetic editing. To balance the scales, we need a different model that is properly structured and incentivized to actually work.
The REAL Solution:
Still, at the end of the day, that ‘solution’ only helps solve the enforcement problem, assuming legislation and regulation ever become reality. It does not solve the true problem: we need a new economic model to fund this technoconomy. One that does not depend on ‘attention,’ ‘persuasion,’ ‘manipulating behavior,’ ‘surveillance,’ or other ‘shenanigans.’
Here’s a simple thought: what if Facebook did away with all those shenanigans and just charged a nominal monthly fee for their service? If my local coffee shop makes me a cappuccino, I pay my $3 and everyone is happy. If all 2.7 billion of their “customers” paid $3 a month, Facebook would not miss a beat in revenues, but it would eliminate the need for the shenanigans and could get back to focusing on its stated altruistic mission. Or perhaps one of you entrepreneurs out there can build an alternative to Facebook with just such a revenue model.
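As a back-of-the-envelope check of that claim, the arithmetic works out roughly as follows. (The user count and fee are the round numbers above; the comparison figure of about $86 billion for Facebook’s 2020 revenue is my own approximation, not from the author.)

```python
# Back-of-the-envelope estimate of the subscription model described above.
# Assumptions: 2.7 billion paying users at $3/month (round numbers from the
# text); ~$86B is an approximate figure for Facebook's 2020 annual revenue.
users = 2.7e9
monthly_fee = 3.00

annual_subscription_revenue = users * monthly_fee * 12  # dollars/year

print(f"Subscription model: ${annual_subscription_revenue / 1e9:.1f}B/year")
print("Approx. 2020 ad-driven revenue, for scale: ~$86B/year")
```

At roughly $97 billion a year, the hypothetical subscription revenue is in the same ballpark as the advertising-driven model it would replace, which is the point of the thought experiment: the fee need not be large to be comparable.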
What are your thoughts? Please tell me, I’m interested.