The Internet is arguably the most important piece of technology ever invented, and its positive impact on the world in such a short time cannot be overstated. While it has never been without its challenges (can you think of a new technology that has been?), it remains an overwhelming force for good. But a number of recent events have led to more serious questions being asked about the integrity of the Internet, and how it could be improved.

Concerns about ‘fake news’ and its potential influence over major political decisions, the ever-broadening scope of cyberbullying across a growing number of platforms, and the potential impact of social media on an individual’s self-image, have all led many to question the Internet’s design. Then, there is the question of the power wielded by the handful of companies that are responsible for delivering the Internet’s most used services – and what they do with the tremendous amounts of personal data they hold on each and every one of us.

We may look back on this time as a period of reflection for the Internet. Genuine questions are being raised about how best to manage the Internet's challenges, and whether the Internet itself needs to be redesigned. But what would that look like?

There are two ways to address the challenges of the Internet – we either reform the technology of the Internet itself (i.e. reset the Internet), or we find a way to regulate what we already have.

Most likely we will need a combination of the two, so let's look at each in turn.

Redesign the Internet

If we took today’s most advanced technologies and attempted to build something new – something which retains all the benefits of the Internet while avoiding all of the drawbacks – what would that look like?

Some alternatives are already being proposed. One example is the Information-Centric Network (ICN) proposed by InterDigital. The advantage of ICN is that it eliminates the client-server topology responsible for much of the latency and duplication of data experienced across the Internet.

Under the client-server structure, information exists somewhere (on a server) and clients (us) must go to that place to access it. Client-server made sense when information was scarce and only located in a small number of places, but today information is being created by all of us at an exponential rate, and data is growing far more quickly at the edge of the network than at the core.

An ICN-based Internet works with this in mind. It would do away with URLs (Uniform Resource Locators), which tell us where on the network the information is, and swap them for URIs (Uniform Resource Identifiers), which tell us what the information is. The difference is that when you want a piece of information, you leave it to the network to find it, and a copy will most likely be much closer to you than some remote server.

The advantages of ICN are a reduction in latency, since data would be accessed from a location much closer to the user, and improved trust, since it removes the ability to use fake URLs, a common tactic for deceiving users with counterfeit websites used for phishing attacks or distributing fake news. These are two very significant improvements.

Self-regulation vs. government influence

It's hard to regulate something that isn't centrally controlled, so how would you go about regulating the Internet? Other than ICANN (the Internet Corporation for Assigned Names and Numbers), there are no global ‘Internet authorities’ solely responsible for our Internet experience. There is little an individual government can do to meaningfully influence the Internet, and even those countries that attempt to impose some level of control or censorship can only do so much.

If individual governments cannot implement a meaningful solution, perhaps we need a global approach. Could a globally endorsed treaty for the Internet be the solution, whereby every country agrees to pursue a common Internet agenda? A Paris Agreement for the Internet, if you will. While such an agreement would be a laudable achievement, I suspect it would be nigh on impossible to reach a technical agreement among all 193 UN member states detailed enough to make any meaningful impact.

However, while it is hard for governments to regulate the Internet, this does not mean those who operate or designed the Internet aren’t looking at their own solutions. Sir Tim Berners-Lee, the inventor of the World Wide Web, leads Solid, an MIT project that proposes decoupling applications from the data they produce. This is a form of self-regulation and adaptation of the Internet.

Solid was founded in response to the growing hegemony of the big Internet players. Major platforms like Facebook and Google have leveraged the open nature of the Internet to become its gatekeepers, and they have grown so large that their influence cannot be overlooked. Facebook, for example, now has over 2 billion active users: it is effectively the filter through which nearly two thirds of the world's Internet users access the Internet. These platforms control much of what is done on the Internet, and have become accessories to the widening ‘fake news’ problem.

The ambition of Solid is to self-regulate the Internet by changing the way data is handled. Today, most Internet companies require you to hand over your data before you can use their services. For example, every picture you post on Facebook belongs to Facebook, because they're the ones who store it. By contrast, an application built on the Solid infrastructure asks users where they want to store their data, with the application requesting access to it. The crucial difference is that the data remains in the ownership of the individual, not the application. You may decide to store your data on Dropbox, but it always remains under your control, and you can prevent the application from accessing it at any time.
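The access model described above can be sketched in a few lines. This is a hypothetical illustration of the idea, not the actual Solid API; the `Pod` class and its method names are invented. What it shows is the inversion of ownership: the data lives in a store the user controls, applications must be granted access, and the grant can be withdrawn at any time.

```python
class Pod:
    """A user-owned data store; a hypothetical sketch, not the real Solid API."""

    def __init__(self, owner: str):
        self.owner = owner
        self._data = {}       # resource path -> content
        self._grants = set()  # app ids the owner has approved

    def grant(self, app_id: str):
        """The owner, not the application, decides who gets access."""
        self._grants.add(app_id)

    def revoke(self, app_id: str):
        """The owner can cut an application off at any time."""
        self._grants.discard(app_id)

    def write(self, app_id: str, path: str, content: bytes):
        if app_id not in self._grants:
            raise PermissionError(f"{app_id} has no access to {self.owner}'s pod")
        self._data[path] = content

    def read(self, app_id: str, path: str) -> bytes:
        if app_id not in self._grants:
            raise PermissionError(f"{app_id} has no access to {self.owner}'s pod")
        return self._data[path]
```

In this model a photo-sharing application would write pictures into the user's pod rather than its own servers, so revoking the grant leaves the pictures with the user and locks the application out.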

Watch this space

Solid and ICN are just two examples of self-regulation and technical changes that can be made to the Internet to reform it. The heartening point is that those responsible for the Internet are also the ones looking to improve it. Through the spirit of openness and collaboration – principles that are so core to the Internet itself – I am confident that such solutions will be delivered far more quickly and effectively than any government-led approach.

It is therefore not a question of regulation vs. re-design, or self-regulation vs. government interference. The Internet will continue to regulate and re-design itself. It has never stopped evolving to address its challenges. By continuing to do so, it will find its own solutions.

Simon Yeoman is general manager of Fasthosts, a provider of Internet access and hosting services based in the UK