Oct 20, 2020
--

Microsoft debuts Azure Space to cater to the space industry, partners with SpaceX for Starlink data center broadband

Microsoft is taking its Azure cloud computing platform to the final frontier: space. It now has a dedicated business unit called Azure Space for that purpose, made up of industry heavyweights and engineers focused on space-sector services, including simulating space missions, gathering and interpreting satellite data to provide insights, and providing global satellite networking capabilities through new and expanded partnerships.

One of Microsoft’s new partners for Azure Space is SpaceX, the progenitor of and a major current player in the so-called “New Space” industry. SpaceX will provide Microsoft with access to its Starlink low-latency satellite broadband network for Microsoft’s new Azure Modular Datacenter (MDC), essentially an on-demand, container-based data center unit that can be deployed in remote locations, either operating on its own or boosting local capabilities.

Image Credits: Microsoft

The MDC is a self-contained unit and can operate off-grid using its own satellite network connectivity add-on. It’s similar in concept to the company’s work on underwater data centers, but keeping it on the ground opens up more options for locating it where people need it, rather than requiring proximity to an ocean or sea.

The other big part of this announcement focuses on space preparedness via simulation. Microsoft today revealed the Azure Orbital Emulator, which provides an emulated environment for testing satellite constellation operations using both software and hardware. The aim is to reproduce in-space conditions as closely as possible on the ground, in order to get everything ready for coordinating large, interconnected constellations of automated satellites in low Earth orbit. That is an increasing need as more defense agencies and private companies pursue this approach over the legacy method of relying on one, two or just a few large geosynchronous spacecraft.

Image Credits: Microsoft

Microsoft says the goal with the Orbital Emulator is to train AI for use on orbital spacecraft before those spacecraft are actually launched — from the early development phase, right up to working with production hardware on the ground before it takes its trip to space. That’s definitely a big potential competitive advantage, because it should help companies spot even more potential problems early on while they’re still relatively easy to fix (not the case on orbit).
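Microsoft hasn’t published what the Orbital Emulator’s interfaces look like, but the underlying idea of exercising constellation-management software against an emulated fleet long before launch can be sketched in a few lines of Python. Everything below (the class, the toy visibility model, the scheduling policy) is a hypothetical illustration, not an Azure API:

```python
# Hypothetical sketch (not an Azure API): emulate a small LEO constellation
# on the ground and exercise a downlink-scheduling policy against it before
# any hardware flies.
import math


class EmulatedSat:
    def __init__(self, name, phase_deg, period_min=95.0):
        self.name = name
        self.phase = math.radians(phase_deg)
        self.rate = 2 * math.pi / (period_min * 60)  # rad/s around the orbit
        self.buffered_mb = 0.0

    def in_view(self, t):
        # Toy visibility model: "over the ground station" for a slice of orbit.
        angle = (self.phase + self.rate * t) % (2 * math.pi)
        return angle < math.radians(20)


def schedule(sats, t):
    """Policy under test: give the pass to the satellite holding the most data."""
    visible = [s for s in sats if s.in_view(t)]
    return max(visible, key=lambda s: s.buffered_mb) if visible else None


if __name__ == "__main__":
    sats = [EmulatedSat(f"sat-{i}", phase_deg=120 * i) for i in range(3)]
    downlinked = 0.0
    for t in range(0, 6 * 3600, 10):      # emulate six hours in 10 s steps
        for s in sats:
            s.buffered_mb += 0.5          # imaging keeps filling the buffer
        chosen = schedule(sats, t)
        if chosen:
            moved = min(chosen.buffered_mb, 50.0)
            chosen.buffered_mb -= moved
            downlinked += moved
    print(f"emulated run downlinked {downlinked:.0f} MB across the constellation")
```

The point is that the same decision logic that will eventually run against real spacecraft can be validated against the emulator first, where a failure costs a code change rather than a mission.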

This emulated environment for on-orbit mission prep is already in use by Azure Government customers, the company notes. It’s also looking for more partners across government and industry for space-related services, including communications, national security, satellite services such as observation and telemetry, and more.

May 19, 2020
--

Microsoft launches Project Bonsai, its new machine teaching service for building autonomous systems

At its Build developer conference, Microsoft today announced that Project Bonsai, its new machine teaching service, is now in public preview.

If that name sounds familiar, it’s probably because you remember that Microsoft acquired Bonsai, a company that focuses on machine teaching, back in 2018. Bonsai combined simulation tools with different machine learning techniques to build a general-purpose deep reinforcement learning platform, with a focus on industrial control systems.

It’s maybe no surprise then that Project Bonsai, too, has a similar focus on helping businesses teach and manage their autonomous machines. “With Project Bonsai, subject-matter experts can add state-of-the-art intelligence to their most dynamic physical systems and processes without needing a background in AI,” the company notes in its press materials.
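Microsoft isn’t detailing Bonsai’s internals here, but machine teaching is usually described as letting a domain expert specify goals and a lesson-by-lesson curriculum while the platform handles the learning itself. Here is a rough sketch of that idea in Python, with a toy simulator and random search standing in for deep reinforcement learning; none of these names come from the actual Bonsai SDK:

```python
# Hypothetical sketch of machine teaching (not the Bonsai SDK): the expert
# writes the curriculum, the platform runs the learning loop.
import random


def simulate(gain, disturbance):
    """Toy process: drive an error signal to zero despite a disturbance."""
    error, score = 1.0, 0.0
    for _ in range(50):
        error += random.uniform(-disturbance, disturbance) - gain * error
        score -= abs(error)
    return score  # closer to zero is better


# The "teaching" part: lessons ordered from easy to hard, each with a target.
curriculum = [
    {"name": "no disturbance", "disturbance": 0.0, "target": -5.0},
    {"name": "mild disturbance", "disturbance": 0.1, "target": -8.0},
    {"name": "heavy disturbance", "disturbance": 0.3, "target": -15.0},
]

gain = 0.1  # the single learnable parameter of this toy policy
for lesson in curriculum:
    for _ in range(2000):  # crude random search stands in for deep RL
        if simulate(gain, lesson["disturbance"]) >= lesson["target"]:
            break
        candidate = gain + random.uniform(-0.05, 0.05)
        if simulate(candidate, lesson["disturbance"]) > simulate(gain, lesson["disturbance"]):
            gain = candidate
    print(f"finished lesson: {lesson['name']} (gain={gain:.2f})")
```

The expert’s knowledge lives in the curriculum and its targets rather than in the learning algorithm, which is roughly what the “without needing a background in AI” pitch refers to.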

“The public preview of Project Bonsai builds on top of the Bonsai acquisition and the autonomous systems private preview announcements made at Build and Ignite of last year,” a Microsoft spokesperson told me.

Interestingly, Microsoft notes that Project Bonsai is only the first building block of a larger vision to help its customers build these autonomous systems. The company also stresses the advantages of machine teaching over other machine learning approaches, especially the fact that it is less of a black box, which makes it easier for developers and engineers to debug systems that don’t work as expected.

In addition to Bonsai, Microsoft also today announced Project Moab, an open-source balancing robot that is meant to help engineers and developers learn the basics of how to build a real-world control system. The idea here is to teach the robot to keep a ball balanced on top of a platform that is held by three arms.

Potential users will be able to either 3D print the robot themselves or buy one when it goes on sale later this year. There is also a simulation, developed by MathWorks, that developers can try out immediately.
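Moab’s task is the classic ball-on-plate balancing problem: measure where the ball is, then tilt the plate to push it back toward the center. A minimal simulated version of that control loop, written as a plain PD controller rather than a learned policy (my own sketch, unrelated to the MathWorks simulation or the Bonsai tooling), looks like this:

```python
# Minimal ball-on-plate sketch (not the MathWorks simulation): a PD controller
# tilts the plate to keep the ball near the center along one axis.
import random


def run(kp=4.0, kd=3.0, dt=0.02, steps=500):
    x, v = random.uniform(-0.05, 0.05), 0.0  # ball position (m) and velocity
    g = 9.81
    for _ in range(steps):
        tilt = -(kp * x + kd * v)            # commanded plate angle (rad)
        tilt = max(-0.2, min(0.2, tilt))     # respect actuator limits
        v += g * tilt * dt                   # small-angle ball acceleration
        x += v * dt
    return abs(x) < 0.005                    # did the ball settle near center?


if __name__ == "__main__":
    print(sum(run() for _ in range(20)), "of 20 runs kept the ball centered")
```

Replacing the hand-tuned kp and kd gains with a policy trained in simulation is the kind of step Project Bonsai is aimed at.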

“You can very quickly take it into areas where doing it in traditional ways would not be easy, such as balancing an egg instead,” said Mark Hammond, Microsoft General Manager for Autonomous Systems. “The point of the Project Moab system is to provide that playground where engineers tackling various problems can learn how to use the tooling and simulation models. Once they understand the concepts, they can apply it to their novel use case.”

Nov 7, 2019
--

How Microsoft is trying to become more innovative

Microsoft Research is a globally distributed playground for people interested in solving fundamental science problems.

These projects often focus on machine learning and artificial intelligence, and since Microsoft is on a mission to infuse all of its products with more AI smarts, it’s no surprise that it’s also seeking ways to integrate Microsoft Research’s innovations into the rest of the company.

Across the board, the company is trying to find ways to become more innovative, especially around its work in AI, and it’s putting processes in place to do so. Microsoft is unusually open about this process, too, and actually made it somewhat of a focus this week at Ignite, a yearly conference that typically focuses more on technical IT management topics.

At Ignite, Microsoft will for the first time present these projects externally at a dedicated keynote. That feels similar to what Google used to do with its ATAP group at its I/O events and is obviously meant to showcase the cutting-edge innovation that happens inside of Microsoft (outside of making Excel smarter).

To manage its AI innovation efforts, Microsoft created the Microsoft AI group led by VP Mitra Azizirad, who’s tasked with establishing thought leadership in this space internally and externally, and helping the company itself innovate faster (Microsoft’s AI for Good projects also fall under this group’s purview). I sat down with Azizirad to get a better idea of what her team is doing and how she approaches getting companies to innovate around AI and bring research projects out of the lab.

“We began to put together a narrative for the company of what it really means to be in an AI-driven world and what we look at from a differentiated perspective,” Azizirad said. “What we’ve done in this area is something that has resonated and landed well. And now we’re including AI, but we’re expanding beyond it to other paradigm shifts like human-machine interaction, future of computing and digital responsibility, as more than just a set of principles and practices but an area of innovation in and of itself.”

Currently, Microsoft is doing a very good job at talking and thinking about horizon one opportunities, as well as horizon three projects that are still years out, she said. “Horizon two, we need to get better at, and that’s what we’re doing.”

It’s worth stressing that Microsoft AI, which launched about two years ago, marks the first time there’s a business, marketing and product management team associated with Microsoft Research, so the team does get a lot of insights into upcoming technologies. Just in the last couple of years, Microsoft has published more than 6,000 research papers on AI, some of which clearly have a future in the company’s products.

Sep 25, 2019
--

QC Ware Forge will give developers access to quantum hardware and simulators across vendors

Quantum computing is almost ready for prime time, and, according to most experts, now is the time to start learning how to best develop for this new and less than intuitive technology. With multiple vendors like D-Wave, Google, IBM, Microsoft and Rigetti offering commercial and open-source hardware solutions, simulators and other tools, there’s already a lot of fragmentation in this business. QC Ware, which is launching its Forge cloud platform into beta today, wants to become the go-to middleman for accessing the quantum computing hardware and simulators of these vendors.

Forge, which like the rest of QC Ware’s efforts is aimed at enterprise users, will give developers the ability to run their algorithms on a variety of hardware platforms and simulators. The company argues that developers won’t need to have any previous expertise in quantum computing, though having a bit of background surely isn’t going to hurt. From Forge’s user interface, developers will be able to run algorithms for binary optimization, chemistry simulation and machine learning.
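QC Ware hasn’t detailed Forge’s API here, but “binary optimization” in this context typically means expressing a problem as a QUBO (quadratic unconstrained binary optimization), the native input format of annealers such as D-Wave’s. The toy example below, a small max-cut instance solved classically by brute force, is my own sketch rather than Forge code, but it shows the shape of what gets submitted:

```python
# Toy QUBO (quadratic unconstrained binary optimization), the kind of
# formulation a binary-optimization service accepts. Solved here by classical
# brute force; a quantum annealer or simulator would replace the inner loop.
from itertools import product

# Max-cut on a triangle (nodes 0, 1, 2) plus a pendant edge to node 3,
# written as QUBO coefficients: minimize
#   sum_i Q[i, i] * x_i + sum_{i < j} Q[i, j] * x_i * x_j   with x_i in {0, 1}.
Q = {
    (0, 0): -2, (1, 1): -2, (2, 2): -3, (3, 3): -1,   # -degree on the diagonal
    (0, 1): 2, (0, 2): 2, (1, 2): 2, (2, 3): 2,       # +2 per edge off-diagonal
}


def energy(bits):
    return sum(coeff * bits[i] * bits[j] for (i, j), coeff in Q.items())


best = min(product([0, 1], repeat=4), key=energy)
print("best assignment:", best, "energy:", energy(best))
```

On a platform like Forge, the brute-force search is the piece a hardware backend or simulator would presumably replace; the QUBO formulation is what the developer supplies.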


“Practical quantum advantage will occur. Most experts agree that it’s a matter of ‘when’ not ‘if.’ The way to pull that horizon closer is by having the user community fully engaged in quantum computing application discovery. The objective of Forge is to allow those users to access the full range of quantum computing resources through a single platform,” said Matt Johnson, CEO, QC Ware. “To assist our customers in that exploration, we are spending all of our cycles working on ways to squeeze as much power as possible out of near-term quantum computers, and to bake those methods into Forge.”

Currently, QC Ware Forge offers access to hardware from D-Wave, as well as open-source simulators running on Google’s and IBM’s clouds, with plans to support a wider variety of platforms in the near future.

Initially, QC Ware also told me that it offered direct access to IBM’s hardware, but that’s not yet the case. “We currently have the integration complete and actively utilized by QC Ware developers and quantum experts,”  QC Ware’s head of business development Yianni Gamvros told me. “However, we are still working with IBM to put an agreement in place in order for our end-users to directly access IBM hardware. We expect that to be available in our next major release. For users, this makes it easier for them to deal with the churn. We expect different hardware vendors will lead at different times and that will keep changing every six months. And for our quantum computing hardware vendors, they have a channel partner they can sell through.”

Users who sign up for the beta will receive 30 days of access to the platform and one minute of actual quantum computing time to evaluate it.

Jun 24, 2016
--

A running tab of what tech people think about whether we’re living in a simulation

Are we living in a simulation? For whatever reason, this is a hot topic in Silicon Valley these days. It all more or less started when Tesla Motors CEO (and soon to be SolarCity CEO — check one off for the simulation argument there) Elon Musk made a claim at the Code Conference that there’s such a high chance that we’re living in a simulation that it’s more likely we…
