Microsoft debuts Azure Space to cater to the space industry, partners with SpaceX for Starlink data center broadband

Microsoft is taking its Azure cloud computing platform to the final frontier — space. It now has a dedicated business unit called Azure Space for that purpose, made up of industry heavyweights and engineers focused on space-sector services, including simulating space missions, gathering and interpreting satellite data to provide insights, and delivering global satellite networking capabilities through new and expanded partnerships.

One of Microsoft’s new partners for Azure Space is SpaceX, the progenitor and major current player in the so-called “New Space” industry. SpaceX will be providing Microsoft with access to its Starlink low-latency satellite-based broadband network for Microsoft’s new Azure Modular Datacenter (MDC) — essentially an on-demand container-based data center unit that can be deployed in remote locations, either to operate on its own or to boost local capabilities.


The MDC is a self-contained unit and can operate off-grid using its own satellite network connectivity add-on. It’s similar in concept to the company’s work on underwater data centers, but keeping it on the ground obviously opens up more options for placing it where people need it, rather than requiring proximity to an ocean or sea.

The other big part of this announcement focuses on space preparedness via simulation. Microsoft revealed the Azure Orbital Emulator today, a computer-emulated environment for testing satellite constellation operations in simulation, using both software and hardware. It basically aims to replicate in-space conditions as closely as possible on the ground, in order to get everything ready for coordinating large, interconnected constellations of automated satellites in low Earth orbit, an increasing need as more defense agencies and private companies pursue this approach over the legacy method of relying on one, two or just a few large geosynchronous spacecraft.


Microsoft says the goal with the Orbital Emulator is to train AI for use on orbital spacecraft before those spacecraft are actually launched — from the early development phase, right up to working with production hardware on the ground before it takes its trip to space. That’s definitely a big potential competitive advantage, because it should help companies spot even more potential problems early on while they’re still relatively easy to fix (not the case on orbit).

This emulated environment for on-orbit mission prep is already in use by Azure Government customers, the company notes. It’s also looking for more partners across government and industry for space-related services, including communication, national security, satellite services such as observation and telemetry, and more.


HPE and NASA make supercomputer on ISS available for experiments

Last year, HPE successfully built and installed a supercomputer on the International Space Station that could withstand the rigors of being in space. Today, the company announced that it is making that computer available for earth-based developers and scientists to conduct experiments.

Mark Fernandez, who has the lofty title of America’s HPC Technology Officer at HPE, says that the project was born with the idea that if we eventually go to Mars, we will need computers that can withstand the travel conditions of being in space for extended periods of time.

What’s more, because space computers have traditionally lacked the sophistication of earth-based computers, they conduct some of the work in space and then complete the calculations on earth. For a Mars trip, this approach would not be feasible due to the distances and latency involved. They needed a computer that could handle processing at the edge (in place) without sending data back to earth.

The original idea was to build a supercomputer with state-of-the-art off-the-shelf parts and install it on the ISS as an experiment to see if this could work. They built the one-teraflop computer in the summer of 2017 and launched it into space on a SpaceX rocket. The computer was built with Intel Broadwell processors, which Fernandez says were the best available at the time.

The first step was to see if the computer they built could survive the launch, the cold temperatures while waiting to be brought on board, the solar radiation and the generally uncommon conditions of being in space.

Once installed, they needed to figure out whether the computer could operate within the power and cooling constraints available onboard the ISS, which are nothing like the highly controlled environment of an earth-based data center. Finally, would the computer operate correctly and give accurate answers?

The special sauce here was a package of software they call Hardened with Software. “We wrote a thin, lightweight suite of software to, quote-unquote, harden our systems with software, so you can take state of the art with you,” he said.

The computer was launched in August 2017 and has been operating ever since, and Fernandez says that it has worked according to plan. “So we’ve achieved our signed, dated and contracted mission. We have a one teraflop supercomputer on board the International Space Station with Intel Broadwell processors.” He says that supercomputer has flown around the earth 6,000 times since launch.

The company now wants to open this computer up as a kind of service for earth-based developers and scientists to experiment with high-latency jobs that would previously have required some processing on earth. With the HPE Spaceborne Computer available to use, they can see what processing this information at the edge would be like (and whether it would work). The computer will be in operation until sometime next year, and in the meantime interested parties need to apply to HPE and NASA to get involved.
