Image is illustrative. Pete Linforth (Pixabay)

The internet ecosystem

The internet is a network of networks. What does that mean, though? Few people understand the complex nature of the internet and how it is run. This text offers a glimpse of the underlying processes that ensure and regulate the availability of web content.

Technical basics – TCP/IP

The internet runs on TCP/IP, a set of communication protocols that structure how data is transported from one place to another. These protocols specify where the data is going, how it is routed on its way, and how it should be packed, unpacked and delivered (Stevens, 1993).

IP stands for Internet Protocol. It enables networking by making the different devices individually identifiable. In a network, each device has an assigned IP address, which in IPv4 is a 32-bit number. In a regular household network, for example, IP addresses are assigned within the home network to allow communication between the devices. In addition, the router has an IP address assigned by the provider that allows connection to the internet. Hence, when setting up a Wi-Fi network at home, it can happen that you can connect to the network but do not have internet access (Heer, Garcia-Morchon, Hummen, Keoh, Kumar, & Wehrle, 2011).
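The 32-bit nature of an IPv4 address can be made concrete with Python's standard `ipaddress` module, which converts between the familiar dotted notation and the underlying number (a minimal sketch; the address used is a typical private home-network example):

```python
import ipaddress

# A typical private home-network address in dotted notation
addr = ipaddress.IPv4Address("192.168.0.1")

# Internally it is just a 32-bit integer
as_number = int(addr)
print(as_number)        # 3232235521
print(addr.is_private)  # True: this range is reserved for local networks

# Converting the number back recovers the dotted form
print(ipaddress.IPv4Address(3232235521))  # 192.168.0.1
```

The dotted notation is simply the four bytes of that integer written in decimal, which is why every IPv4 address falls between 0.0.0.0 and 255.255.255.255.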

TCP is the Transmission Control Protocol, which essentially checks the integrity of the data. It makes sure that the data is not lost or modified on the way, acknowledging received segments and retransmitting any that go missing.
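One of the integrity mechanisms in the TCP/IP family is a 16-bit checksum carried in each segment (the ones'-complement sum defined in RFC 1071). A simplified Python sketch of the idea:

```python
def internet_checksum(data: bytes) -> int:
    """Simplified ones'-complement 16-bit checksum (RFC 1071 style)."""
    if len(data) % 2:  # pad odd-length data with a zero byte
        data += b"\x00"
    total = 0
    for i in range(0, len(data), 2):
        total += (data[i] << 8) | data[i + 1]      # add 16-bit words
        total = (total & 0xFFFF) + (total >> 16)   # fold the carry back in
    return ~total & 0xFFFF

payload = b"internet"
checksum = internet_checksum(payload)
# A receiver re-sums the data together with the checksum; an undamaged
# segment yields zero, a corrupted one almost certainly does not.
print(internet_checksum(payload + checksum.to_bytes(2, "big")))  # 0
```

Real TCP additionally covers parts of the IP header in this sum and combines it with sequence numbers and acknowledgements, but the verification principle is the same.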

Following the earlier thought from the home network to the internet, the router (modem) connects via the phone or cable lines to the Internet Service Provider (ISP). A mobile phone (when not using Wi-Fi) connects directly through its phone network to the ISP. This connection might be mediated through the distribution channels of the ISP. The ISP connects to the next level up, which in most cases will be one of the large internet backbone networks. These are, figuratively, the backbones of the internet: they connect the different regions and carry enormous loads of data.

In order to send a message to another user, a device has to pack all relevant information into TCP/IP packets and send them via the local Wi-Fi to the ISP, which channels the data to the backbone. Based on the addressing information in the packets, the data is routed towards another backbone network and then down to an ISP and another user’s network, where their application is able to receive it (Escudero, 2003).
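On the programming side, applications hand their data to TCP through the socket interface and let the operating system handle the packing, routing and delivery described above. A minimal sketch over the local loopback interface (no ISP or backbone is involved here, but the same API is used for connections across the internet):

```python
import socket
import threading

def echo_server(server_sock: socket.socket) -> None:
    """Accept one connection and echo whatever arrives back to the sender."""
    conn, _addr = server_sock.accept()
    with conn:
        data = conn.recv(1024)
        conn.sendall(data)

# Set up a TCP server on the loopback address; port 0 lets the OS pick a free port
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))
server.listen(1)
threading.Thread(target=echo_server, args=(server,), daemon=True).start()

# The client connects, sends a message and reads the reply; TCP ensures the
# bytes arrive complete and in order (or the connection fails)
message = b"hello, internet"
client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
client.connect(server.getsockname())
client.sendall(message)

reply = b""
while len(reply) < len(message):  # TCP is a byte stream; read until complete
    chunk = client.recv(1024)
    if not chunk:
        break
    reply += chunk
client.close()
server.close()
print(reply)  # b'hello, internet'
```

Note that the application never deals with routing or retransmission; those are handled by the TCP/IP stack beneath the socket calls.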

Internet governance

The technical structure of the internet allows it to be international wherever there are access points. How these interactions are structured, ensured and regulated is negotiated through the process of internet governance.

Internet governance rests on many shoulders: governments that can set up national legislation, private companies that run platforms too big to fail, the technical community setting protocols and standards, civil society standing up for users’ rights, academia adding a research perspective and international institutions trying to create dialogue between these groups. Hence, much of internet governance takes place as a multi-stakeholder process, meaning the different groups should discuss issues together.
One of the biggest processes in the field is the Internet Governance Forum (IGF), hosted by the UN and strictly multi-stakeholder. This process is meant to be a dialogue. In order to bring all groups to the table, the IGF does not create specific outputs but rather tries to facilitate discussions between the stakeholders so those with power to implement can do so based on informed decisions (Van Eeten & Mueller, 2013).

Another important space for internet governance is the so-called I* organisations, which mostly represent the technical community and maintain the technical infrastructure of the internet, such as agreements on protocols and standards. They include the five Regional Internet Registries and the Internet Corporation for Assigned Names and Numbers (ICANN), which essentially deal with domain registry and all related issues. These structures ensure, for example, that people can type memorable domain names instead of having to remember numeric IP addresses.
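The name-to-address mapping that the domain name system provides can be illustrated with a toy resolver. This is a deliberately simplified sketch: the table below is made up (the addresses come from reserved documentation ranges), while real DNS is a distributed, hierarchical database queried over the network:

```python
# Hypothetical records standing in for the distributed DNS database
RECORDS = {
    "example.org": "198.51.100.4",
    "my-blog.example.org": "203.0.113.7",
}

def resolve(name: str) -> str:
    """Look up the IP address registered for a domain name."""
    name = name.lower().rstrip(".")  # domain names are case-insensitive
    try:
        return RECORDS[name]
    except KeyError:
        raise LookupError(f"no address record for {name!r}") from None

print(resolve("Example.Org"))  # 198.51.100.4
```

In practice a browser asks a resolver, which walks the DNS hierarchy from the root servers down through the registries ICANN coordinates, but the end result is the same kind of name-to-number lookup.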

Other members of the I* organisations include the Internet Engineering Task Force (IETF), the Internet Architecture Board (IAB) and, despite the name, the World Wide Web Consortium (W3C). These groups deal with communication standards, including common standards for the data flow of apps or regulation for the Internet of Things, such as the internal communication of smart home devices.

Last but not least, they include the Internet Society (ISOC), which on the one hand deals with technological standards and on the other opens the doors for civil society in these processes, dedicating itself to education on these topics throughout the world.

Resulting difficulties

With all of these structures in place, there is still plenty to do. While the international nature of the internet has created a whole new sense of community, states still operate within national jurisdictions, or at best regional groups like the EU. This quickly reaches its limits when laws are broken and need to be enforced. An individual based in Germany, selling counterfeit goods from China on Facebook, via a website on a server in Brazil under a company based in the USA, creates a jurisdictional nightmare. While there are often cooperation agreements in place to identify the individual, these operate within a network of often contradictory laws. With sufficient technical knowledge on their side, the perpetrator might be very difficult to identify.

One way this has been approached recently is through the concept of platform liability. To enforce hate speech legislation in Germany, Facebook has been made liable for what its users post. In essence, Facebook is responsible for flagging and removing hate speech content or will receive hefty fines for non-compliance (Deutsche Welle, 2019). However, this approach is not unchallenged: the legislation puts responsibilities on Facebook that should lie with prosecutors and lets Facebook take decisions that should be taken by courts (Krempl, 2017).

The new EU Copyright Directive creates a similar issue by making platforms responsible for copyright infringement on their sites (Directive (EU) 2019/790). The only realistic way to screen the amount of content uploaded to larger platforms is through algorithms that block uploads once a copyright infringement is detected. In this case, decisions traditionally taken by courts are outsourced to algorithmic decision-making, and only after a user contests a block is there human involvement in checking the case.


Many large issues in internet governance remain unsolved and the processes can merely help to inform those who are willing to inform themselves. Pressing topics are often approached unilaterally and outside these processes. Some issues will only be solved on a global scale.

A major concern is the fragmentation of the internet. Many countries are beginning to erect strong filter walls that limit what content can be seen, allowing only content approved by opaque government standards. Some countries do not subscribe to the standard of net neutrality, which should guarantee that all users have the same access to the same internet, with all their data treated equally. Where net neutrality is not in place, private networks can limit access to parts of the internet or slow down traffic to certain websites, which can also lead to a situation in which parts of the internet simply vanish for users (Drake, Cerf & Kleinwächter, 2016).

On a global scale, there is also the concern that the biggest companies in the world carry large amounts of traffic and bind users to content within their own networks. In the western part of the world, Amazon, Apple and Google compete for users with massive resources and comprehensive offerings for nearly every service. In the past, they have actually banned each other’s services from their hardware devices. In the future, will all links to other networks vanish, depending on whose browser you use?

Sir Tim Berners-Lee, the inventor of the World Wide Web, launched the Contract for the Web on 23 November 2019. The web is broken and it needs everyone on board to fix it, he claims. The contract is supported by over 80 organisations and proposes 76 clauses, which are based, amongst other sources, on the Charter of Fundamental Rights of the European Union and the UN Universal Declaration of Human Rights. The contract is summarised in 9 principles:

  1. Ensure everyone can connect to the internet
  2. Keep all of the internet available, all of the time
  3. Respect and protect people’s fundamental online privacy and data rights
  4. Make the internet affordable and accessible to everyone
  5. Respect and protect people’s privacy and personal data to build online trust
  6. Develop technologies that support the best in humanity and challenge the worst
  7. Be creators and collaborators on the Web
  8. Build strong communities that respect civil discourse and human dignity
  9. Fight for the Web

(World Wide Web Foundation, 2019)

Maybe this will set the tone for the internet for the next decade; one can always hope.


Martin Fischer

Martin Fischer is a media and game educator. Coming originally from a political education background, he created the gameoverhate initiative and various digital engagement opportunities for young people with an interest in internet governance. He currently runs projects for vulnerable youth, using digital games to enhance digital and social competences.