Big Data from Space: Satellite Images for Everyone via Open Telekom Cloud

Updated: October 15, 2018


Copernicus, the Earth observation program of the European Space Agency (ESA), produces satellite imagery of the planet. Twenty terabytes of data accumulate every day, and ESA makes this data available to everyone via the Open Telekom Cloud.

Tractor, plow, and combine harvester are no longer enough for farmers. Anyone who wants to cultivate their fields successfully today, both effectively and sustainably, can rely on a tool from space: Sentinel-2A. The ESA satellite, which weighs around 1,000 kilograms, is part of the Copernicus Earth observation program and circles the planet in a polar orbit. Together with Sentinel-2B, it forms a pair that scans the Earth's surface with a swath width of 290 kilometers.


Big data instead of farmers' rules of thumb

The satellite pair takes ten days to capture the entire Earth at a resolution of 10 x 10 meters, not only in the visible light spectrum but also in the infrared range, for example to capture plant vitality and photosynthetic activity. The images then show where in the fields more fertilizer is needed due to nutrient deficits, or which areas need more water. Farmers see where plants thrive and where growth is poor.
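Plant vitality is typically derived from the red and near-infrared measurements by computing a vegetation index. A minimal sketch, assuming plain reflectance values between 0 and 1; the article does not name a specific index, so NDVI is used here as a common choice for this kind of data:

```python
def ndvi(red, nir):
    """Normalized Difference Vegetation Index: (NIR - red) / (NIR + red).

    Healthy vegetation reflects strongly in the near-infrared and weakly
    in red, so NDVI approaches +1; bare soil stays near 0.
    """
    return (nir - red) / (nir + red)

# Illustrative reflectance values (assumptions, not real Sentinel-2 pixels):
healthy = ndvi(red=0.05, nir=0.50)    # high index: vigorous growth
bare_soil = ndvi(red=0.30, nir=0.35)  # near zero: little vegetation
```

Mapping such an index over a whole field is what reveals the low-growth areas described above.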


The Sentinel-2 satellites detect in detail what is completely hidden from the human eye. The advantage for farmers: with the information from orbit, they minimize the risk of harvest losses and increase economic efficiency.


Not only farmers work with data from the Copernicus program (box: "Copernicus - Europe's Guardian in Space"), but also shipping companies, insurers, and construction companies. The Copernicus Data and Information Access Services, DIAS for short, make the information available to everyone free of charge in the Open Telekom Cloud. ESA's goal: everyone should be able to use the data from space easily, without time-consuming data transfers, without investing in their own IT infrastructure, and without special know-how.


Copernicus - Europe's Guardian in Space

In 1998, the European Space Agency (ESA) and the European Commission laid the foundation for the EU Earth observation program Copernicus. In 2014, Copernicus started operations. Since then, its satellites have been monitoring the planet's health. Using sensors and cameras, they continuously deliver images and information about landmasses, oceans, and the atmosphere. Copernicus focuses on six core services:


  • Land monitoring records changes on the surface and in inland waters. The program supports urban planning, water management, forest monitoring, and agriculture, for example.
  • Marine environment monitoring keeps an eye on water quality and provides data for navigation at sea.
  • The atmosphere is also in focus: the satellites analyze the composition of our planet's gas envelope and evaluate air quality.
  • Climate change monitoring is a cross-cutting area fed by data from the other areas.
  • Copernicus also contributes to disaster and crisis management. In the case of natural disasters, the extent of damage can be determined and aid requirements derived.
  • The security of the EU's external borders is also a focus of the satellites, which support border protection.


Satellite data for everyone

On behalf of ESA, T-Systems has, to a certain extent, democratized the information from space: via the Mundi Web Services platform, not only the data but also ready-made analysis services are available. Instead of each user implementing evaluation services for themselves, Mundi provides the solution for everyone: users and analysis offerings are brought together in the Open Telekom Cloud, exactly where the huge ESA datasets reside. After a transition and testing phase, Mundi Web Services launched in June 2018. Since then, the service has been available in the Open Telekom Cloud, a public cloud offering based on the free cloud computing architecture OpenStack. The Bonn-based provider operates the Open Telekom Cloud in its own highly secure, multi-certified data centers in Magdeburg and Biere.


Meeting in the cloud

The evaluation services offered on the Mundi Web Services Marketplace can be used much like apps from an app store. Instead of data having to be downloaded first, the services process the information directly in the cloud. This is practical, especially since the necessary computing and memory resources are also available there, which make complex geo-informatics applications possible in the first place. Mundi invites third parties to build their own services and market them through the Marketplace. Over time, more and more specialized data services will emerge, such as e-GEOS Grassland Monitoring, which keeps an eye on vegetation growth.


Economic storage of the data

All satellite data supplied by the Copernicus program is stored in the Object Storage Service (OBS) of the Open Telekom Cloud. In addition, Mundi complements it with information from many other sources, such as NASA databases or open-data directories. A middleware ensures economical data management: it assigns each bit its place in storage. Depending on access rate and timeliness, the middleware moves the satellite data between storage areas divided into three performance classes: standard for daily access, warm for monthly access, and cold for only a few accesses per year. Costs vary depending on the storage class. Mundi Web Services provides the data itself free of charge; those who use evaluation services pay for computing resources on a pay-per-use model.
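The tiering logic described above can be sketched as a simple classification by access frequency. A minimal sketch; the thresholds are illustrative assumptions, not the middleware's actual rules:

```python
def storage_class(accesses_per_year):
    """Pick an OBS performance class from the access rate.

    Thresholds are assumptions chosen to match the article's wording:
    'daily' access -> standard, 'monthly' -> warm, 'a few per year' -> cold.
    """
    if accesses_per_year >= 365:   # roughly daily access
        return "standard"
    if accesses_per_year >= 12:    # roughly monthly access
        return "warm"
    return "cold"                  # only a few accesses per year

# A recent scene queried every day stays hot; an old archive scene goes cold:
print(storage_class(400))  # standard
print(storage_class(2))    # cold
```

Moving rarely used scenes to the cold class is what keeps a multi-petabyte archive affordable.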


A metadata service indexes all data in storage. Image and data files can thus be searched easily by GPS position and time. Information about the origin of each recording is also included: the satellites carry individual sensors, such as multispectral cameras, Doppler radars, or radiometers. The metadata service captures all of these attributes and also incorporates in-situ data from terrestrial measurement systems, such as meteorological sensors, weather balloons, or sea buoys. In the future, aggregated data packages are also planned as part of the offering, for example specially prepared image datasets that spare users the work of preparing the information themselves.
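A search over such an index amounts to filtering records by bounding box and acquisition window. A minimal sketch with hypothetical field names and entries; the real Mundi metadata API will differ:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class SceneMeta:
    """Hypothetical metadata record for one satellite scene."""
    lat: float
    lon: float
    acquired: datetime
    sensor: str

def search(index, lat_range, lon_range, start, end):
    """Return scenes inside the bounding box acquired within [start, end]."""
    return [s for s in index
            if lat_range[0] <= s.lat <= lat_range[1]
            and lon_range[0] <= s.lon <= lon_range[1]
            and start <= s.acquired <= end]

# Illustrative index entries (assumed values):
index = [
    SceneMeta(52.1, 11.6, datetime(2018, 6, 1), "multispectral camera"),
    SceneMeta(48.1, 11.6, datetime(2018, 6, 2), "radiometer"),
]
hits = search(index, (50.0, 54.0), (10.0, 13.0),
              datetime(2018, 5, 1), datetime(2018, 7, 1))
```

Only the first scene falls inside the queried box, so `hits` contains a single record.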


Selection in the Marketplace

Where previously more than 100,000 users downloaded the Copernicus data every day, putting considerable strain on the networks, everything now runs on the virtual machines and servers of the Open Telekom Cloud. Instead of competing for transmission capacity, providers now compete with their services on the Mundi Marketplace. In addition, the Open Telekom Cloud provides all the necessary big data tools: users can rely on software-as-a-service offerings such as MapReduce to process large amounts of data quickly and in parallel on a Hadoop cluster. Besides virtual machines on Elastic Cloud Servers, bare-metal servers are available, and databases are offered as a platform as a service.
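The MapReduce paradigm mentioned above splits a job into a map phase that emits key-value pairs and a reduce phase that aggregates them per key. A minimal single-process sketch of the idea, not of the Hadoop service itself:

```python
from collections import defaultdict

def map_reduce(records, mapper, reducer):
    """Toy MapReduce: group mapper output by key, then reduce each group.

    On a real Hadoop cluster the map and reduce calls run in parallel
    across many machines; here they run sequentially for illustration.
    """
    groups = defaultdict(list)
    for record in records:
        for key, value in mapper(record):
            groups[key].append(value)
    return {key: reducer(key, values) for key, values in groups.items()}

# Hypothetical example: count scenes per satellite
scenes = ["S2A", "S2B", "S2A"]
counts = map_reduce(scenes, lambda s: [(s, 1)], lambda k, vs: sum(vs))
```

The same shape (emit pairs, shuffle by key, aggregate) scales to petabytes once each phase is distributed across a cluster.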


All data from the Copernicus program should remain available over the long term. Over a period of four years, an archive of around 40 petabytes will be created, equivalent to 8 billion smartphone photos of 5 megabytes each. Not only the current data but also the historical data provide valuable insights: the farmer takes better fertilizing measures today, and tomorrow can compare the data with his yields and increase the harvest, thanks to big data from space.
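The smartphone-photo comparison checks out, using decimal units throughout:

```python
# 8 billion photos at 5 MB each, converted from megabytes to petabytes
photo_mb = 5
photos = 8_000_000_000
total_pb = photo_mb * photos / 1_000_000_000  # 1 PB = 1,000,000,000 MB
print(total_pb)  # 40.0
```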



Research with public cloud

The Copernicus program is not only a gigantic collaborative project but also a model of European cooperation in industrial development and space exploration. The European Commission manages the Copernicus program together with the member states, ESA, the European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT), the European Centre for Medium-Range Weather Forecasts (ECMWF), other EU agencies, and Mercator Océan. Via the Copernicus DIAS, companies, developers, and the public access the data. This should not only generate new insights but also create new jobs: the EU Commission expects more than 48,000 new jobs to arise from the data and its applications.


Public cloud offerings such as the Open Telekom Cloud are also in demand elsewhere in science. Universities and research institutions can no longer economically meet their growing storage and computing needs with resources from their own server rooms. Under the leadership of the European nuclear research center CERN, for example, Helix Nebula (box: "Helix Nebula - The Cloud for Research") has been launched on the Open Telekom Cloud: a hybrid research cloud that provides scientists with cross-border computing and storage resources. Researchers call up IT capacity as needed.


Helix Nebula - The cloud for research

The need for storage and computing resources in research is growing continuously. The institutes' own data centers can barely cover the demand. The European nuclear research center CERN has therefore commissioned a hybrid cloud platform together with other leading research institutes: Helix Nebula - The Science Cloud. T-Systems operates the research cloud on behalf of CERN and implements a high-performance, multi-cloud solution. The technical basis is the Open Telekom Cloud. Researchers throughout Europe can combine the Open Telekom Cloud with their own IT resources to create a hybrid cloud. The advantage: scientists can book computing, storage, network, security, and management services flexibly and quickly, all with the highest level of data security and data protection and a guaranteed availability of at least 99.95 percent. CERN, for example, relies on Helix Nebula to carry out complex calculations in particle physics. At peak times, CERN uses around 1,000 virtual machines, 500 terabytes of storage space, and a bandwidth of 10 gigabytes per second.