P2P Standards - There Are Many, Therefore, There Are None.

The consequence is, being of no party, I shall offend all parties: never mind! ~ Lord Byron, Don Juan: Canto 09.

I've written of it before, I think. I have a preoccupation with how things should be, rather than with how they are. I'm lucky enough to live in a nation where this is (mostly) acceptable and not one where it might be punishable by death. I'm a socialist/anarchist, like my trade unionist grandfather, except he had to identify as "a fellow traveller", and very carefully at that. I get to live freely in my country, holding these beliefs openly, if rather disappointed that my country is not a communal collective of communal collectives all the way down. I see housing as a human right, yet so many of my now quite mature-age friends, especially women, are one car crash or one big bill away from potential eviction. I believe passionately that the internet should be decentralised and serverless, because that's what DARPA set out to create: a decentralised, literally bomb-proof, terrestrial communications system. Meh, life is full of disappointments, yet I am as happy and as passionate as I was at 18. A little bounded rationalism is a useful thing. Being, frankly, bourgeois as fuck in life probably helps, too.

Still, despite being secure as I approach old age, there are things I want to achieve and changes I want to see, and I've given a LOT of the last 5 years of my retirement to trying to make them happen. One project in particular: a decentralised, serverless, peer-to-peer internet that feels like Web 2, or even Web 1! As in just there, easy to use, easy to have your own space, but nobody mining your private data! In the Australia of the early 1990s, the Facebook business model would have been declared illegal! "Here, talk about your most private things, with only your trusted confidants... (thinks: while we eavesdrop, interpret your reactions and sell tools for businesses to manipulate you, even manipulate election results with your data.)" I'm writing this on Google's Blogger. Irony? Much. Just as bad, they're just "not evil" about it. Yeah, sure. But there are problems with peer-to-peer, too. Big ones, as I see it.

Now it gets technical, sorry, but hey, the whinge about privacy and barely legal privacy theft is over.

How the internet was conceived, decentralised and bomb-proof, is not how the internet is today. It's "client/server" almost down to bedrock, and we're not the clients (we are in network engineering terms, but... hey), we're the data cows being milked for our knee-jerks wherever we surf! There are teams working on peer-to-peer systems to rectify this: I2P (misidentified by the media as "The Dark Web"), IPFS (the InterPlanetary File System) and BitTorrent (misidentified by the media as an "illegal file sharing network"), to name just 3 of oh, so many. Sadly, they're not as easy to use as "the web." Then there are all the smart devices that have dumbed us down. Most apps these days have between, maybe, 3 and 12 buttons and half as many associated points of data entry. These apps also centralise our access to information and services, even further than the client/server web model. The digital serfdom of Hayek and Varoufakis, et al., is strong on this planet! Yet peer-to-peer and data privacy just aren't being adopted by anybody other than full-on nerds or, worse, the "tinfoil hats" and "cookers."

The reason isn't just that new tech is often difficult; most of the apps using IPFS as their infrastructure are really quite mature, even though IPFS's push to become an IETF-recognised protocol remains incomplete. There are several reasons, as I see it...

  • Humans don't, as a rule, adopt early unless something is...
    • Easy to use,
    • Really exciting or,
    • Where all their friends already are,
  • There's a strong personal reason to change, such as work or social necessity (Zoom in the pandemic),
  • There's mainstream trust and acceptance of the new thing, it just works, say, like the telephone, or email,
  • And the stupidest reason of all, "fashion." (Temu, anybody? Hotmail in the 90s? Really?)

Sadly, peer-to-peer networking, while serving the needs of some niche pockets like "crypto bros" (as in "investors", not cryptography geeks), freedom fighters and coders, just isn't taking off. I have tried several different systems, and I can't get people to switch. In this case, one of my daughters and a close family friend a social network doth not make. And that one is still more client/server than true peer-to-peer; it's just free and works. Nobody thinks they have a reason to switch. Privacy and security are a damned good reason. Living in an unmonitored, unmanipulated society is a damned good reason to switch. A single, official standard, like email has in its RFC documents from the IETF (RFC 3501, for example), might help, but only a little.

So, here we stand. A world that needs a true peer-to-peer network, but no universally accessible protocol standard to cover all platforms, while those who could create this kind of network structure are all competing against each other, hoping to get "tech bro" rich, all the while reinventing each other's wheel. And the IETF doesn't seem to care, really. Why should it? It will wait until a superior standard emerges as supreme; it's a standards body.

Humans! What we have is what the world is. Frustrating.

So, is there a solution? There is, and history has a model: the 1980s. The personal computer had exploded on the scene. Platforms proliferated, even within single companies! (Apple had the II, the Lisa and, shortly after the Lisa, the Mac, all of which overlapped.) The platform that dominated by the end of the 80s was the IBM compatible, running DOS. It became a de facto standard. As a Mac fanboi, I hated that a major corporation "owned" a standard for the platform and another owned a "standard" for the OS, but history chose the interoperable system that "worked" and was accessible. So, while hardware has finally become irrelevant (to an extent), even Microsoft has conceded POSIX is the standard for all OSes. (The Windows Subsystem for Linux.) Peer-to-peer developers need to be looking to this.

In that 80s PC history is another lesson: VisiCalc. The spreadsheet was the killer app. What is peer-to-peer's killer app? Cook me in a frypan for saying it, but my guess is not a social network. At least not yet, anyway. I suspect it's a mashup of Wikipedia (wikis in general, in fact), a C++ port of gun.js (a graph database in javascript) and git, the document version control software that's found on nearly every desktop computer's command line, if you know how to look for it. In fact, git's pretty close to the original peer-to-peer protocol and can work that way, if needed, with existing network transports, but it does stink of nerd.

Yeah, I'm "thinkin' out loud" here, but let's break this big reveal down. Thanks for your patience and for reading this far, too, BTW. It's appreciated.

The first part of my claim, that a decentralised internet has a wiki frontend, specifically a Wikipedia-like frontend, seems obvious to me. It's the "Everything, everywhere, all at once" of the internet. It's the important, the profane, the interesting and the niche. It's an ideal that appeals to the centrist, the leftist and the right-wing cooker, all at once. My version, if I possessed the skills to create it, would be a little more tightly designed. This type of user frontend on a peer-to-peer application would need to be a bit less idealistic (sorry Jimmy, but you know you were, mate), so I propose a 2-tier user system. Ordinary users would still be able to create articles and publish them to the world, but the articles would have no "weight". Weight in this context is an upvote variable, and every citizen-initiated article would start with a weight of zero. Every citizen of no recognised expertise in their field would have a vote-weight of 1.

Academics, engineers, researchers, technical writers, etc. would be able to include a cryptographic key from the faculty, institute or employer where they work (more work for the IT department, sorry guys) that would give them a weight of, say, 8. Maybe a bit higher if they have higher status in their field, with 10 as a hard maximum. Outside of their field, their weight would be 1, like the rest of us, because their key would only vouch for what they are expert in at their institution. It would be possible to include several keys; some experts have several fields, or at least sub-fields. This system has the one thing Facebook ever got right: NO DOWN-VOTING!!!

So, my political scientist daughter, about to be a published author globally, would automatically have an initial weight of, say, 8 on any article on cultural diplomacy, her field, because she would be able to get a hash key from her faculty that says that is her field. Me, writing on audio design, even though that's my field, my articles would start at 0, because I have no formal qualifications and no longer work in the audio production (broadcast) sector. Article topics and categories would be pretty much ad hoc, but the system would have a preprogrammed style guide in the app that would guide category selections based on article keywords. The article I write on "audio design, how to set a level" would reside only on my device. It goes nowhere unless somebody sees it in a search and selects it for download.
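To make the two-tier idea concrete, here's a minimal sketch of the weighting rules described above, in the project's target language of C++. All the names (`Credential`, `Author`, the functions) are mine, purely illustrative, not a spec: an article starts at its author's credential weight for that field (0 for the rest of us), and an upvote counts for the voter's credential weight, minimum 1.

```cpp
#include <algorithm>
#include <string>
#include <vector>

// Hypothetical: an expertise credential is a field tag vouched for by an
// institution's cryptographic key. Only the weighting logic is sketched here.
struct Credential {
    std::string field;   // e.g. "cultural-diplomacy"
    int weight;          // institution-granted weight, capped at 10
};

struct Author {
    std::vector<Credential> credentials;
};

// The author's recognised weight in a given field, 0 if they hold no key for it.
int credentialWeight(const Author& a, const std::string& field) {
    for (const auto& c : a.credentials)
        if (c.field == field)
            return std::min(c.weight, 10);  // hard maximum of 10
    return 0;
}

// Every citizen-initiated article starts at zero; a credentialed author's
// article starts at their credential weight for the article's field.
int initialArticleWeight(const Author& a, const std::string& field) {
    return credentialWeight(a, field);
}

// An upvote is worth the voter's credential weight, minimum 1.
// There is deliberately no down-vote function.
int voteWeight(const Author& voter, const std::string& field) {
    return std::max(credentialWeight(voter, field), 1);
}
```

Outside their credentialed field, an expert's upvote falls back to 1, the same as everyone else's, which is the whole point of tying the key to a field rather than to a person.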

Now, people search within a super-topic, or its child sub-topics, for articles on this distributed wiki app, and they are given a list of articles ordered by weight, then date. My article on setting audio levels has been in the system since the early days, it's spot on, and has many upvotes (I like to imagine even an upvote from Steve Albini), so it now has a very high weight. Despite being older, it comes up near the top, but not the very top. People keep finding it a helpful article on how to record sound with a technically proficient signal strength in a wide variety of situations, so it keeps "floating" until every human knows how to do it, or somebody writes a better one, and it finally begins sinking. Probably Steve Albini's, frankly. Any articles that get read are cached with their unique hash on the readers' devices, so they're never lost, even if I lose my phone or laptop in a house fire.
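That "weight, then date" ordering is simple enough to sketch. Assuming a result record with an accumulated weight and a publication timestamp (field names are mine, for illustration), the ranking is just a sort with weight as the primary key and recency breaking ties:

```cpp
#include <algorithm>
#include <string>
#include <vector>

// Illustrative search-result record; names are assumptions, not a spec.
struct ArticleRef {
    std::string title;
    long long weight;     // accumulated, vote-weighted upvotes
    long long published;  // Unix timestamp
};

// Order results by weight (highest first), then by date (newest first).
void rankResults(std::vector<ArticleRef>& results) {
    std::sort(results.begin(), results.end(),
              [](const ArticleRef& a, const ArticleRef& b) {
                  if (a.weight != b.weight) return a.weight > b.weight;
                  return a.published > b.published;  // newer wins on a tie
              });
}
```

This is why an old, heavily upvoted article "floats" near the top but not at the very top: a newer article of equal weight edges ahead of it on the date tie-break.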

"FANTASY!" you cry! "You can't distribute files on a network like that without a server!" You can. That's where a C++ version of Gun.js comes in. As I mentioned above, Gun.js is a javascript graph database. It's local-first, always, which means the file always belongs to its author and, while readers have copies, those copies can be persisted locally if somebody wants to keep them. They can be lost forever if not persisted when seen by a reader, but the author can unpublish them. Or recall them on a new device if their old device was in their burning house. The downside of Gun.js for network infrastructure is that it's javascript.

Gun can't be compiled into a system extension to provide a network service for everything on that system; it can only run in a browser (be that browser an app or Chrome) and can only be served from a node in an app. (Or Chrome.) Hence, the tool needs to be a C++ extension library, because most system software is written in C++. (macOS is written in Objective-C and Swift, so ports to those would help, as well as to emerging system languages like Go and Rust.) Anyways, Gun.js is amazing, check it out at https://gun.eco/ . The use of a graph database that is local-first makes peer-to-peer file storage and retrieval more than possible; it makes fantasy a reality. It's literally the lightest-weight distributed file system out there, and it's pitched as a database. Good god, y'all! Gun, done right (in C++), is our network hash table, our file system, our upvote tracking.
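To show what "local-first with content-addressed copies" means in miniature, here's a toy C++ sketch of the persistence model described above. It is not Gun's API; everything here is a made-up stand-in (including using `std::hash` in place of a real content hash like SHA-256): each node keeps its own store, and a reader who "pins" a fetched article keeps that copy even under cache pressure, so a published article survives the author's house fire.

```cpp
#include <functional>
#include <map>
#include <optional>
#include <set>
#include <string>

// Toy local-first, content-addressed store. All names are illustrative.
class LocalStore {
    std::map<size_t, std::string> cache_;  // content hash -> article body
    std::set<size_t> pinned_;              // hashes the reader chose to keep
public:
    // Author (or reader) stores a body locally; the hash is its identity.
    size_t put(const std::string& body) {
        size_t h = std::hash<std::string>{}(body);  // stand-in content hash
        cache_[h] = body;
        return h;
    }
    std::optional<std::string> get(size_t h) const {
        auto it = cache_.find(h);
        if (it == cache_.end()) return std::nullopt;
        return it->second;
    }
    void pin(size_t h) { pinned_.insert(h); }  // reader persists this copy
    // Under cache pressure, unpinned copies may be dropped; pinned ones stay.
    void evict(size_t h) {
        if (!pinned_.count(h)) cache_.erase(h);
    }
};
```

The real thing would gossip these hashes between peers, which is exactly the part a "gun.cpp" network hash table would supply; this sketch only shows why a copy on any reader's device is enough to keep an article alive.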

The third prong of this crazy fork is git. Git is a version management tool. It's literally the guts of the centralised software repository service https://github.com/ and it can also run on local machines as a server. It's where those pesky academic and technical elites come in again, not to mention authors who collaborate on articles, as well. Articles created by authors with an academic or technical hash key need version control, not just a database to store them, retrieve them, edit and delete them. And academic knowledge needs a formal maintainer, just like professional code does.

This is what Webs 1, 2 and 3 don't have: the editor-in-chief! This is how we prevent "wiki vandalism" in this killer app idea! Even nobodies like me are maintainers of our tiny little audio articles. I can edit; you can only raise an issue, or fork, then pull-request. I approve (or not) the edit before it goes live. Unless you're Steve Albini; Steve gets to edit my article because he has a recognised expert hash key in the field and I don't. My shoulders are broad, I was a professional in this field, once. I welcome editorial intervention, especially from a god in the field like Mr Albini. We all should. (I edited sound in a newsroom; editorial control is the most important human invention ever!)
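The editor-in-chief rule above boils down to one permission check, sketched here with invented names: the author edits directly, a recognised expert in the article's field also edits directly, and everyone else takes the fork-and-pull-request path.

```cpp
#include <string>
#include <vector>

// Illustrative types only; the expert fields would come from the same
// institution-signed keys used for article weighting.
struct User {
    std::string id;
    std::vector<std::string> expertFields;  // fields with recognised keys
};

struct Article {
    std::string authorId;
    std::string field;
};

// True: edit lands directly. False: fork, edit, then raise a pull request
// for the maintainer (the author) to approve.
bool canEditDirectly(const User& u, const Article& art) {
    if (u.id == art.authorId) return true;
    for (const auto& f : u.expertFields)
        if (f == art.field) return true;
    return false;
}
```

Git already models both paths: a direct edit is a commit to the canonical branch, and everything else is a fork plus a merge the maintainer reviews, which is why git, not a plain database, belongs in the stack.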

Now, wikis, graph databases and git aren't entirely compatible in this kind of model. They don't just literally sit in a circle holding hands. Does "gun.cpp" wrap around git, or does git wrap around the database network? I don't know. Arduino is my most current programming experience. I'm learning C++ to systems and applications level as a retirement project. I've been following programming as a topic since I was in high school in the 70s, but only as a side hobby. I guess what I'm trying to say is, I can't do this project alone. I need more than help, I need a hacker collective around me, and I'll bring the tea and scones, the encouragement and hugs, as well as ideas.

What I do know is this: this isn't web 3. This is web 4 and the killer app, all rolled into one! This is a system-level protocol in the making. A walk-up start for an RFC document. As much a breakthrough for the web as the personal computer was for business in the 80s. So, I'll leave this here in this early web 2 corner of the universe and, if you think you can tie these 3 systems together, or part of them, or know others who can help, I'm here to bring ideas, resolve disputes and supply tea and scones, if you're in Melbourne, Australia, at least.

But wait... [pause] [audience whoops] ...THERE IS MORE! [rampant applause]
What if hardware were also inherently peer-to-peer? There are better technologies than ethernet, wifi and mobile hotspots. What if this project, in parallel to creating a network infrastructure that connects over tcp/ip, also had a team building a carrierless network? No mobile plans, no broadband plans? You see, this is the space I play in, and I do mean play, I'm not a professional, I'm retired: microcontrollers and the internet of things. What if there were a dongle that worked in unlicensed spectrum, that found peers and exchanged connection keys via LoRa radios, and connected as network links via "high power" and intermediary wifi side channels? A true, world-wide web of personal radio links. No phone, no ISP, no middleman selling services.
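Such a dongle needs surprisingly little protocol: hear a peer's LoRa beacon, exchange keys over LoRa, and only then bring up the faster wifi side channel. Here's a purely illustrative C++ state machine for that handshake bookkeeping, with no radio code at all; every name in it is an assumption, not a design:

```cpp
#include <map>
#include <string>

// Illustrative link states for the imagined LoRa + wifi dongle.
enum class LinkState { Discovered, Keyed, WifiUp };

class PeerTable {
    std::map<std::string, LinkState> peers_;  // peer id -> link state
public:
    // A LoRa beacon was heard: record the peer (keep state if already known).
    void onBeacon(const std::string& id) {
        peers_.emplace(id, LinkState::Discovered);
    }
    // Connection keys were exchanged over LoRa; only discovered peers qualify.
    bool onKeyExchange(const std::string& id) {
        auto it = peers_.find(id);
        if (it == peers_.end() || it->second != LinkState::Discovered) return false;
        it->second = LinkState::Keyed;
        return true;
    }
    // The wifi side channel may only come up for peers we hold keys for.
    bool onWifiLink(const std::string& id) {
        auto it = peers_.find(id);
        if (it == peers_.end() || it->second != LinkState::Keyed) return false;
        it->second = LinkState::WifiUp;
        return true;
    }
    LinkState state(const std::string& id) const { return peers_.at(id); }
};
```

Refusing the wifi link until keys have crossed the slow LoRa channel is the point: discovery can stay low-power and long-range, while the short-range, high-bandwidth link is only offered to peers you've already authenticated.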

Can this be done in a cluttered spectrum? This would be the true, world-wide web! Look up the unPhone. What if that had more range than standard wifi? What if wifi devices could be war-walked by freenet evangelists on a crusade to cut out the middleman? That would be web 5!

No, I am not taking drugs. This is deadly serious! Steal these ideas if you must!


  1. Captain's log (supplemental): it seems git is already a potential, offline-first, CRDT system, as per this Medium article...

    The proof-of-concept code they link to is still javascript, but it's also bundled with a tutorial, so a more robust, systems-grade app could be built in C/C++, Go or Rust, too, and git is available as a library, to natively include its features in an app. From my perspective, losing gun.js from this idea makes the potential of the project even stronger! No javascript to convert to C++.

