Data storage from space to Earth: 3 takeaways for the real world


This article was contributed by Yaniv Iarovici, director of IoT Segment Marketing.

The space race is launching into new territory, and there’s no room for error. Initiatives range from expensive one-shot missions, such as NASA’s $10 billion James Webb Space Telescope, to small satellites, also known as ‘smallsats,’ which offer new ways to explore and capitalize on this burgeoning market. 

What the two have in common, however, is that any space initiative must be able to survive the extreme conditions and perils of outer space. A tremendous amount of R&D, planning and strategy goes into every single piece of technology in these rockets and devices to ensure it meets the unique requirements of a successful mission.

That research and development has had a unique impact on the technologies we use every day, as product development teams take those learnings and apply them to everyday tech. Examples include the Global Positioning System (GPS) and the CMOS image sensors behind smartphone cameras, the latter developed at the Jet Propulsion Laboratory in Pasadena.  

Following are three major takeaways that the real world can learn from developing technologies for space, a frontier that may hold the key to our future. These lessons can also help improve the products we use on a daily basis. 

The pertinence of reliability 

Space is an absolutely unforgiving environment, unlike anything seen on Earth, and getting there (or making it back) presents equally difficult conditions for the technology enduring the extremes. 

On takeoff, electronic components endure violent beatings from extreme vibration, and once in orbit, every material needs to withstand thermal swings of 260 degrees Fahrenheit (about 144 degrees Celsius) every hour of every day. Components also need to endure space radiation, which threatens to degrade them until they cease to function, and survive random space phenomena and ionizing particles that can spear through microchips like a hot knife through butter.

Even the James Webb Space Telescope — a technological innovation and renowned resource — still has 344 points of failure that can doom the mission at any time. 

What this all means is that reliability is mission-critical to space technology, and that reliability has been purposefully developed, proven and integrated into the surrounding technologies. Product development teams across space organizations have shared some keys to reliability, resulting in materials deemed “space grade”; for solid-state memory specifically, the term is “radiation hardened.”

From automotive parts that stand the test of time, to “simple” cookware designed to bear the repetitive stress of high heat, to common electronics in our day-to-day lives, countless products we interact with have benefited from reliability lessons learned in space. Those lessons have also shaped product design, ensuring products are not only aesthetically pleasing but remarkably functional as well.

Western Digital engineers have been working with companies in space on their approaches to data storage. One approach, known as Design for Reliability (DFR), has become a standard engineering practice: reliability is designed into products from the start using state-of-the-art methods. As technology continues to advance and highly complex devices continue to shrink, DFR helps new electronic components meet demanding performance and low-voltage requirements. DFR made leaps and bounds in space technology development, and the fruits of those labors are quietly making their way into more and more products.

Requirements for data integrity 

Everything in space, from rockets to satellites large and small, generates vast volumes of data. For example, according to HSAT, in 2020 there were 2,666 operational satellites in orbit. Together, these satellites capture thousands of terabytes of data every day, which equates to petabytes daily. For context, 1 petabyte is 1,000 terabytes, and 1 terabyte is about 1,000 gigabytes, which is enough storage for roughly 250 feature-length movies. That is a ton of data.
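As a back-of-envelope check on those figures, the arithmetic can be sketched in a few lines (using decimal units and a hypothetical average of 4 GB per feature-length movie; the 2,000 TB/day fleet total is purely illustrative):

```python
# Illustrative check of the storage figures above.
GB_PER_TB = 1_000
TB_PER_PB = 1_000
GB_PER_MOVIE = 4  # assumed average size of a feature-length movie

movies_per_tb = GB_PER_TB // GB_PER_MOVIE
print(movies_per_tb)  # 250 movies fit in one terabyte

# If the fleet captures "thousands of terabytes" a day -- say 2,000 TB --
# that is petabytes of data per day:
daily_tb = 2_000
daily_pb = daily_tb / TB_PER_PB
print(daily_pb)       # 2.0 petabytes per day
yearly_pb = daily_pb * 365
print(yearly_pb)      # 730.0 petabytes per year
```

Even at these conservative assumptions, a single year of fleet-wide capture lands in the high hundreds of petabytes.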

And in the future, there will be even more data collected — NASA is planning two space missions, SWOT and NISAR, that are expected to produce roughly 100 terabytes of data per day. Not all of this data can be sent back to Earth in real time, nor should it be, which means that effectively storing it in space is the only way to make this data useful and actionable.
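To see why streaming everything home in real time is impractical, here is a rough sketch of the sustained downlink that 100 TB/day would require (the daily figure comes from the article; the decimal-unit conversion is an assumption):

```python
# Rough downlink requirement for 100 TB of data per day.
TB = 1_000_000_000_000   # 1 terabyte in bytes (decimal)
SECONDS_PER_DAY = 86_400

daily_bytes = 100 * TB
required_bps = daily_bytes * 8 / SECONDS_PER_DAY  # bits per second
print(round(required_bps / 1e9, 2))  # 9.26 Gbit/s, sustained around the clock
```

A continuous multi-gigabit link from orbit, with no downtime, is a tall order, which is why storing and processing much of the data in space makes sense.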

To properly process and handle that data in space, and to make it useful, engineers have identified acute requirements for data integrity, ensuring the data can be analyzed in space or relayed back home for analysis.

There are various definitions of data integrity; one common metric is the uncorrectable bit error rate (UBER), which applies to enterprise applications as well. However it is defined, maintaining data integrity is critical, because data corruption and loss can cause incorrect calculations, and even accidents. 

Just as cars on Earth use data in advanced driver-assistance systems (ADAS), rockets and satellites use similar systems that rely on data, and data integrity, to operate. As space technology develops increasingly better systems for data integrity, those same systems improve intelligent applications around us, from AI-driven models in the enterprise to autonomous vehicles on the road.

Requirements for data storage in space

If data integrity is core to success in space, data storage is equally if not more important, for it is the foundation on which data can be accessed, stored and analyzed. Data storage tends to be one of the most overlooked, yet most significant, aspects of technology around us; how data is stored, processed and moved has a direct impact on the compute and analysis applied to it, which dictates its usefulness.

Data storage in space needs to endure the rigors of a space mission, from the launch, orbit and return-to-Earth challenges mentioned earlier. The resulting evolution in data storage reliability has, in turn, led to an evolution of advanced data storage use cases around us. The transportation industry is one example: cars, buses and trains now leverage advanced data storage technologies that essentially transform those vehicles into roaming data centers on wheels (or tracks). Data storage advances are driving countless industries forward, and that has sparked a virtuous cycle in which advances in reliability drive advances in data storage, and the cycle repeats.

Launching into the future

As space technology grows in sophistication, each of us benefits from the advances, and chances are those applications will subtly continue to appear in the technologies we use daily. After all, space is the ultimate testing ground — if something can survive there, it can survive anywhere.

We’re already seeing the fruits of rocket science in the reliability of everyday technology, and this has directly spurred advancement in data integrity and data storage. Data integrity advances have enabled smarter applications, from smart cars to enterprise software, and data storage advances have transformed industries such as transportation, driving the rise of “traveling data centers.”

We might take some of these lessons for granted, but hopefully these innovations excite and inspire us; as more advances are made, they will elevate our daily experiences in turn. 

Yaniv Iarovici is the director of IoT segment marketing at Western Digital.

Originally appeared on: TheSpuzz