Oct 09, 2018

Microsoft shows off government cloud services with JEDI due date imminent

Just a day after Google dropped out of the bidding for the Pentagon’s massive $10 billion, 10-year JEDI cloud contract, Microsoft announced expanded services for government clients. In a long blog post, the company laid out its government-focused cloud services.

While today’s announcement is not directly related to JEDI, the timing is interesting, coming just three days ahead of the October 12th deadline for submitting bids. The post is about showing just how comprehensive the company’s government-specific cloud services are.

In the post, Julia White, Microsoft’s corporate vice president for Azure, made it clear the company was focusing hard on the government business. “In the past six months we have added over 40 services and features to Azure Government, as well as publishing a new roadmap for the Azure Government regions providing ongoing transparency into our upcoming releases,” she wrote.

“Moving forward, we are simplifying our approach to regulatory compliance for federal agencies, so that our government customers can gain access to innovation more rapidly. In addition, we are adding new options for buying and onboarding cloud services to make it easier to move to the cloud. Finally, we are bringing an array of new hybrid and edge capabilities to government to ensure that government customers have full access to the technology of the intelligent edge and intelligent cloud era,” White added.

While much of the post covered the general value proposition of Azure, including security, identity, artificial intelligence and edge data processing, it also included a slew of items aimed specifically at government clients.

For starters, the company is increasing its FedRAMP compliance. FedRAMP is the federal program that sets the security requirements cloud vendors must meet to serve federal agencies. Specifically, Microsoft is moving 50 services from FedRAMP Moderate to High ratings.

“By taking the broadest regulatory compliance approach in the industry, we’re making commercial innovation more accessible and easier for government to adopt,” White wrote.

In addition, Microsoft announced it’s expanding its Azure Government Secret regions, a solution designed specifically for handling highly classified information in the cloud. This one appears to take direct aim at JEDI. “We are making major progress in delivering this cloud designed to meet the regulatory and compliance requirements of the Department of Defense and the Intelligence Community. Today, we are announcing these newest regions will be available by the end of the first quarter of 2019. In addition, to meet the growing demand and requirements of the U.S. Government, we are confirming our intent to deliver Azure Government services to meet the highest classification requirements, with capabilities for handling Top Secret U.S. classified data,” White wrote.

The company’s announcements, which included many pieces that had been announced previously, are clearly designed to show off its government chops at a time when a major government contract is up for grabs. Azure Stack for Government, announced in August, is one such piece mentioned in this blog post.

Oct 04, 2018

GitHub gets a new and improved Jira Software Cloud integration

Atlassian’s Jira has become a standard for managing large software projects in many companies. Many of those same companies also use GitHub as their source code repository and, unsurprisingly, there has long been an official way to integrate the two. That old way, however, was often slow, limited in its capabilities and unable to cope with the large code bases that many enterprises now manage on GitHub.

Almost as if to prove that GitHub remains committed to an open ecosystem, even after the Microsoft acquisition, the company today announced a new and improved integration between the two products.

“Working with Atlassian on the Jira integration was really important for us,” GitHub’s director of ecosystem engineering Kyle Daigle told me ahead of the announcement. “Because we want to make sure that our developer customers are getting the best experience of our open platform that they can have, regardless of what tools they use.”

So a couple of months ago, the team decided to build its own Jira integration from the ground up, and it’s committed to maintaining and improving it over time. As Daigle noted, the improvements here include better performance and a better user experience.

The new integration also makes it easier to view all the pull requests, commits and branches from GitHub that are associated with a Jira issue, to search for issues based on information from GitHub and to see the status of the development work right in Jira. And because changes in GitHub trigger an update to Jira, that data should remain up to date at all times.

The old Jira integration, which works over the so-called Jira DVCS connector, will be deprecated, and GitHub will start prompting existing users to upgrade over the next few weeks. The new integration ships as a GitHub app, so it also comes with all of the security features the platform has to offer.

Sep 25, 2018

Chef launches deeper integration with Microsoft Azure

DevOps automation service Chef today announced a number of new integrations with Microsoft Azure. The news, which was announced at the Microsoft Ignite conference in Orlando, Florida, focuses on helping enterprises bring their legacy applications to Azure and ranges from the public preview of Chef Automate Managed Service for Azure to the integration of Chef’s InSpec compliance product with Microsoft’s cloud platform.

With Chef Automate as a managed service on Azure, which provides ops teams with a single tool for managing and monitoring their compliance and infrastructure configurations, developers can now easily deploy and manage Chef Automate and the Chef Server from the Azure Portal. It’s a fully managed service and the company promises that businesses can get started with it in as little as thirty minutes (though I’d take those numbers with a grain of salt).

When those configurations need to change, Chef users on Azure can now also use Chef Workstation with Azure Cloud Shell, Azure’s browser-based command line. Workstation is one of Chef’s newest products and focuses on making ad-hoc configuration changes, whether or not the node is managed by Chef.

And to remain in compliance, Chef is also launching an integration of its InSpec security and compliance tools with Azure. InSpec works hand in hand with Microsoft’s new Azure Policy Guest Configuration (who comes up with these names?) and allows users to automatically audit all of their applications on Azure.

“Chef gives companies the tools they need to confidently migrate to Microsoft Azure so users don’t just move their problems when migrating to the cloud, but have an understanding of the state of their assets before the migration occurs,” said Corey Scobie, the senior vice president of products and engineering at Chef, in today’s announcement. “Being able to detect and correct configuration and security issues to ensure success after migrations gives our customers the power to migrate at the right pace for their organization.”

Sep 24, 2018

The 7 most important announcements from Microsoft Ignite today

Microsoft is hosting its Ignite conference in Orlando, Florida this week. And although Ignite isn’t the household name that Microsoft’s Build conference has become over the course of the last few years, it’s a massive event with over 30,000 attendees and plenty of news. Indeed, there was so much news this year that Microsoft provided the press with a 27-page booklet with all of it.

We wrote about quite a few of these today, but here are the most important announcements, including one that wasn’t in Microsoft’s booklet but was featured prominently on stage.

1. Microsoft, SAP and Adobe take on Salesforce with their new Open Data Initiative for customer data

What was announced: Microsoft is teaming up with Adobe and SAP to create a single model for representing customer data that businesses will be able to move between systems.

Why it matters: Moving customer data between different enterprise systems is hard, especially because there isn’t a standardized way to represent this information. Microsoft, Adobe and SAP say they want to make it easier for this data to flow between systems. But it’s also a shot across the bow of Salesforce, the leader in the CRM space. It also represents a chance for these three companies to enable new tools that can extract value from this data — and Microsoft obviously hopes that these businesses will choose its Azure platform for analyzing the data.


2. Microsoft wants to do away with more passwords

What was announced: Businesses that use Microsoft Azure Active Directory (AD) will now be able to use the Microsoft Authenticator app on iOS and Android in place of a password to log into their business applications.

Why it matters: Passwords are annoying and they aren’t very secure. Many enterprises are starting to push their employees to use a second factor to authenticate. With this, Microsoft now replaces the password/second factor combination with a single tap on your phone — ideally without compromising security.


3. Microsoft’s new Windows Virtual Desktop lets you run Windows 10 in the cloud

What was announced: Microsoft now lets businesses rent a virtual Windows 10 desktop in Azure.

Why it matters: Until now, virtual Windows 10 desktops were the domain of third-party service providers. Now, Microsoft itself will offer these desktops. The company argues that this is the first time you can get a multiuser virtualized Windows 10 desktop in the cloud. As employees become more mobile and don’t necessarily always work from the same desktop or laptop, this virtualized solution will allow organizations to offer them a full Windows 10 desktop in the cloud, with all the Office apps they know, without the cost of having to provision and manage a physical machine.


4. Microsoft Office gets smarter

What was announced: Microsoft is adding a number of new AI tools to its Office productivity suite. Those include Ideas, which aims to take some of the hassle out of using these tools. Ideas may suggest a layout for your PowerPoint presentation or help you find interesting data in your spreadsheets, for example. Excel is also getting a couple of new tools for pulling in rich data from third-party sources. Microsoft is also building a new unified search tool for finding data across an organization’s network.

Why it matters: Microsoft Office remains the most widely used suite of productivity applications. That makes it the ideal surface for highlighting Microsoft’s AI chops, and anything that can improve employee productivity will surely drive a lot of value to businesses. If that means sitting through fewer badly designed PowerPoint slides, then this whole AI thing will have been worth it.


5. Microsoft’s massive Surface Hub 2 whiteboards will launch in Q2 2019

What was announced: The next version of the Surface Hub, Microsoft’s massive whiteboard displays, will launch in Q2 2019. The Surface Hub 2 is both lighter and thinner than the original version. Then, in 2020, an updated version, the Surface Hub 2X, will launch that will offer features like tiling and rotation.

Why it matters: We’re talking about a 50-inch touchscreen display here. You probably won’t buy one, but you’ll want one. It’s a disappointment to hear that the Surface Hub 2 won’t launch until next year and that some of the advanced features most users are waiting for won’t arrive until the refresh in 2020.


6. Microsoft Teams gets bokeh and meeting recordings with transcripts

What was announced: Microsoft Teams, its Slack competitor, can now blur the background when you are in a video meeting and it’ll automatically create transcripts of your meetings.

Why it matters: Teams has emerged as a competent Slack competitor that’s quite popular with companies that are already betting on Microsoft’s productivity tools. Microsoft is now bringing many of its machine learning smarts to Teams to offer features that most of its competitors can’t match.


7. Microsoft launches Azure Digital Twins

What was announced: Azure Digital Twins allows enterprises to model their real-world IoT deployments in the cloud.

Why it matters: IoT presents a massive new market for cloud services like Azure. Many businesses were already building their own version of Digital Twins on top of Azure, but those homegrown solutions didn’t always scale. Now, Microsoft is offering this capability out of the box, and for many businesses, this may just be the killer feature that makes them decide to standardize their IoT workloads on Azure. And as they use Azure Digital Twins, they’ll also want to use the rest of Azure’s many IoT tools.

Sep 24, 2018

Microsoft, SAP and Adobe take on Salesforce with their new Open Data Initiative for customer data

Microsoft, SAP and Adobe today announced a new partnership: the Open Data Initiative. This alliance, which is a clear attack against Salesforce, aims to create a single data model for consumer data that is then portable between platforms. That, the companies argue, will provide more transparency and privacy controls for consumers, but the core idea here is to make it easier for enterprises to move their customers’ data around.

That data could be standard CRM data, but also information about purchase behavior and other information about customers. Right now, moving that data between platforms is often hard, given that there’s no standard way for structuring it. That’s holding back what these companies can do with their data, of course, and in this age of machine learning, data is everything.

“We want this to be an open framework,” Microsoft CEO Satya Nadella said during his keynote at the company’s annual Ignite conference. “We are very excited about the potential here about truly putting customers in control of their own data for our entire industry,” he added.

The exact details of how this is meant to work are a bit vague right now, though. Unsurprisingly, Adobe plans to use this model for its Customer Experience Platform, while Microsoft will build it into its Dynamics 365 CRM service and SAP will support it on its Hana database platform and CRM platforms, too. Underneath all of this is a single data model and then, of course, Microsoft Azure — at least on the Microsoft side.

“Adobe, Microsoft and SAP are partnering to reimagine the customer experience management category,” said Adobe CEO Shantanu Narayen. “Together we will give enterprises the ability to harness and action massive volumes of customer data to deliver personalized, real-time customer experiences at scale.”

Together, these three companies have the footprint to challenge Salesforce’s hold on the CRM market and create a new standard. SAP especially has put a lot of emphasis on the CRM market lately, and while that business is growing fast, it’s still far behind Salesforce.

Sep 24, 2018

Microsoft Azure gets new high-performance storage options

Microsoft Azure is getting a number of new storage options today that mostly focus on use cases where disk performance matters.

The first of these is Azure Ultra SSD Managed Disks, which are now in public preview. Microsoft says that these drives will offer “sub-millisecond latency,” which unsurprisingly makes them ideal for workloads where latency matters.

Earlier this year, Microsoft launched its Premium and Standard SSD Managed Disks offerings for Azure into preview. These ‘ultra’ SSDs represent the next tier up from the Premium SSDs, with even lower latency and higher throughput. They’ll offer up to 160,000 IOPS with less than a millisecond of read/write latency, and they’ll come in sizes ranging from 4GB to 64TB.

Speaking of Standard SSD Managed Disks: that service is now generally available after only three months in preview. To top things off, all of Azure’s managed disk tiers (Premium and Standard SSD, as well as Standard HDD) now offer 8, 16 and 32 TB capacity options.

Also new today is Azure Premium Files, which is now in preview. This, too, is an SSD-based service. Azure Files itself isn’t new; it offers users access to cloud storage using the standard SMB protocol. The new premium tier promises higher throughput and lower latency for these kinds of SMB operations.

Sep 24, 2018

Microsoft hopes enterprises will want to use Cortana

In a world dominated by Alexa and the Google Assistant, Cortana suffers the fate of a perfectly good alternative that nobody uses and everybody forgets about. But Microsoft wouldn’t be Microsoft if it just gave up on its investment in this space, so it’s now launching the Cortana Skills Kit for Enterprise to see if that’s a niche where Cortana can succeed.

This new kit is an end-to-end solution for enterprises that want to build their own skills and agents. Of course, they could have done this before, using the existing developer tools, and this kit isn’t all that different from those. Microsoft notes, though, that it is designed for deployment inside an organization and represents a new platform for enterprises to build these experiences on.

The Skills Kit platform is based on the Microsoft Bot Framework and the Azure Cognitive Services Language Understanding feature.

Overall, this is probably not a bad bet on Microsoft’s part. I can see how some enterprises would want to build their own skills for their employees and customers to access internal data, for example, or to complete routine tasks.

For now, this tool is only available in private preview. No word on when we can expect a wider launch.

Sep 24, 2018

Microsoft updates its planet-scale Cosmos DB database service

Cosmos DB is undoubtedly one of the most interesting products in Microsoft’s Azure portfolio. It’s a fully managed, globally distributed multi-model database that offers throughput guarantees, a number of different consistency models and high read and write availability guarantees. Now that’s a mouthful, but basically, it means that developers can build a truly global product, write database updates to Cosmos DB and rest assured that every other user across the world will see those updates within 20 milliseconds or so. And to write their applications, they can pretend that Cosmos DB is a SQL- or MongoDB-compatible database, for example.
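
To make that concrete, here is a minimal sketch of what working against Cosmos DB’s SQL (document) API looks like with the Python azure-cosmos SDK (a newer SDK than the one available when this was written); the account endpoint, key and database/container names are placeholders, not anything Microsoft announced:

```python
from azure.cosmos import CosmosClient, PartitionKey

# Placeholders: substitute a real Cosmos DB account endpoint and key.
ACCOUNT_URL = "https://<your-account>.documents.azure.com:443/"
ACCOUNT_KEY = "<your-account-key>"

client = CosmosClient(ACCOUNT_URL, credential=ACCOUNT_KEY)

# Databases and containers are created on demand; the partition key determines
# how Cosmos DB spreads documents across physical partitions.
database = client.create_database_if_not_exists("demo-db")
container = database.create_container_if_not_exists(
    id="profiles",
    partition_key=PartitionKey(path="/userId"),
)

# Write a JSON document, then read it back with a SQL-style query.
container.upsert_item({"id": "profile-1", "userId": "u-42", "displayName": "Ada"})
items = container.query_items(
    query="SELECT * FROM c WHERE c.userId = @userId",
    parameters=[{"name": "@userId", "value": "u-42"}],
    enable_cross_partition_query=True,
)
for item in items:
    print(item["displayName"])
```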

Cosmos DB officially launched in May 2017, though in many ways it’s an evolution of Microsoft’s earlier DocumentDB product, which was far less flexible. Today, a lot of Microsoft’s own products run on Cosmos DB, including the Azure Portal itself, as well as Skype, Office 365 and Xbox.

Today, Microsoft is extending Cosmos DB by moving its multi-master replication feature into general availability and adding support for the Cassandra API, giving developers yet another option for bringing existing applications, in this case those written for Cassandra, to Cosmos DB.

Microsoft now also promises 99.999 percent read and write availability. Previously, its read availability promise was 99.99 percent. And while that may not seem like a big difference, it does show that after more than a year of operating Cosmos DB with customers, Microsoft now feels more confident that it’s a highly stable system. In addition, Microsoft is updating its write latency SLA and now promises less than 10 milliseconds at the 99th percentile.

“If you have write-heavy workloads, spanning multiple geos, and you need this near real-time ingest of your data, this becomes extremely attractive for IoT, web, mobile gaming scenarios,” Microsoft CosmosDB architect and product manager Rimma Nehme told me. She also stressed that she believes Microsoft’s SLA definitions are far more stringent than those of its competitors.

The highlight of the update, though, is multi-master replication. “We believe that we’re really the first operational database out there in the marketplace that runs on such a scale and will enable globally scalable multi-master available to the customers,” Nehme said. “The underlying protocols were designed to be multi-master from the very beginning.”

Why is this such a big deal? With this, developers can designate every region they run Cosmos DB in as a master in its own right, making for a far more scalable system in terms of being able to write updates to the database. There’s no need to first write to a single master node, which may be far away, and then have that node push the update to every other region. Instead, applications can write to the nearest region, and Cosmos DB handles everything from there. If there are conflicts, the user can decide how those should be resolved based on their own needs.
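
As a rough sketch of what this looks like from an application’s point of view, here is a hedged example using the Python azure-cosmos SDK. The multi-write and conflict-resolution options shown (multiple_write_locations, preferred_locations, conflict_resolution_policy) reflect my understanding of the SDK’s connection and container settings and should be read as assumptions rather than exact signatures; the account details and names are placeholders:

```python
from azure.cosmos import CosmosClient, PartitionKey

# Placeholders for a real Cosmos DB account endpoint and key.
ACCOUNT_URL = "https://<your-account>.documents.azure.com:443/"
ACCOUNT_KEY = "<your-account-key>"

# Assumed connection options: treat every replicated region as writable and
# prefer the regions closest to this application instance.
client = CosmosClient(
    ACCOUNT_URL,
    credential=ACCOUNT_KEY,
    multiple_write_locations=True,
    preferred_locations=["West US 2", "North Europe"],
)

database = client.create_database_if_not_exists("orders-db")

# A last-writer-wins policy resolves conflicting writes from different regions
# by comparing the document timestamp; custom policies are also possible.
container = database.create_container_if_not_exists(
    id="orders",
    partition_key=PartitionKey(path="/customerId"),
    conflict_resolution_policy={
        "mode": "LastWriterWins",
        "conflictResolutionPath": "/_ts",
    },
)

# The write lands in the nearest region; Cosmos DB replicates it to the rest.
container.upsert_item({"id": "order-1", "customerId": "c-7", "total": 99.50})
```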

Nehme noted that all of this still plays well with CosmosDB’s existing set of consistency models. If you don’t spend your days thinking about database consistency models, then this may sound arcane, but there’s a whole area of computer science that focuses on little else but how to best handle a scenario where two users virtually simultaneously try to change the same cell in a distributed database.

Unlike many other databases, Cosmos DB allows for a variety of consistency models, ranging from strong to eventual, with three intermediary models (bounded staleness, session and consistent prefix). And it actually turns out that most Cosmos DB users opt for one of those intermediary models.
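
Choosing one of those models is typically a one-line decision when the client is created. A minimal sketch, again with the Python azure-cosmos SDK and placeholder account details, assuming the consistency_level keyword:

```python
from azure.cosmos import CosmosClient

ACCOUNT_URL = "https://<your-account>.documents.azure.com:443/"  # placeholder
ACCOUNT_KEY = "<your-account-key>"                               # placeholder

# "Session" is one of the intermediary models: a client reliably reads its own
# writes without paying the global latency cost of strong consistency. Other
# values include "Strong", "BoundedStaleness", "ConsistentPrefix" and "Eventual".
client = CosmosClient(ACCOUNT_URL, credential=ACCOUNT_KEY, consistency_level="Session")
```

One caveat worth knowing: a client can only request a consistency level at or below the level configured as the account’s default; it cannot ask for something stronger.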

Interestingly, when I talked to Leslie Lamport, the Turing award winner who developed some of the fundamental concepts behind these consistency models (and the popular LaTeX document preparation system), he wasn’t all that sure that the developers are making the right choice. “I don’t know whether they really understand the consequences or whether their customers are going to be in for some surprises,” he told me. “If they’re smart, they are getting just the amount of consistency that they need. If they’re not smart, it means they’re trying to gain some efficiency and their users might not be happy about that.” He noted that when you give up strong consistency, it’s often hard to understand what exactly is happening.

But strong consistency comes with drawbacks of its own, chief among them higher latency. “For strong consistency there are a certain number of roundtrip message delays that you can’t avoid,” Lamport noted.

The Cosmos DB team isn’t just building on some of the fundamental work Lamport did around distributed systems; it’s also making extensive use of TLA+, the formal specification language Lamport developed in the late ’90s. Microsoft, Amazon and other companies now train their engineers to use TLA+ to describe their algorithms mathematically before they implement them in whatever language they prefer.

“Because [CosmosDB is] a massively complicated system, there is no way to ensure the correctness of it because we are humans, and trying to hold all of these failure conditions and the complexity in any one person’s — one engineer’s — head, is impossible,” Microsoft Technical Fellow Dharma Shukla noted. “TLA+ is huge in terms of getting the design done correctly, specified and validated using the TLA+ tools even before a single line of code is written. You cover all of those hundreds of thousands of edge cases that can potentially lead to data loss or availability loss, or race conditions that you had never thought about, but that two or three years after you have deployed the code can lead to some data corruption for customers. That would be disastrous.”

“Programming languages have a very precise goal, which is to be able to write code. And the thing that I’ve been saying over and over again is that programming is more than just coding,” Lamport added. “It’s not just coding, that’s the easy part of programming. The hard part of programming is getting the algorithms right.”

Lamport also noted that he deliberately chose to make TLA+ look like mathematics, not like another programming language. “It really forces people to think above the code level,” he noted, adding that engineers often tell him that it changes the way they think.

As for those companies that don’t use TLA+ or a similar methodology, Lamport says he’s worried. “I’m really comforted that [Microsoft] is using TLA+ because I don’t see how anyone could do it without using that kind of mathematical thinking — and I worry about what the other systems that we wind up using built by other organizations — I worry about how reliable they are.”

Sep 24, 2018

Microsoft wants to put your data in a box

AWS has its Snowball (and Snowmobile truck), Google Cloud has its data transfer appliance and Microsoft has its Azure Data Box. All of these are physical appliances that let enterprises move lots of data to the cloud by loading it onto the devices and then shipping them back to the provider. Microsoft’s Azure Data Box launched into preview about a year ago and today, the company is announcing a number of updates and adding a few new boxes, too.

First of all, the standard 50-pound, 100-terabyte Data Box is now generally available. If you’ve got a lot of data to transfer to the cloud — or maybe collect a lot of offline data — then FedEx will happily pick this one up and Microsoft will upload the data to Azure and charge you for your storage allotment.

If you’ve got a lot more data, though, then Microsoft now also offers the Azure Data Box Heavy. This new box, which is now in preview, can hold up to one petabyte of data. Microsoft did not say how heavy the Data Box Heavy is, though.

Also new is the Azure Data Box Edge, which is now also in preview. In many ways, this is the most interesting of the additions since it goes well beyond transporting data. As the name implies, Data Box Edge is meant for edge deployments where a company collects data. What makes this version stand out is that it’s basically a small data center rack that lets you process data as it comes in. It even includes an FPGA to run AI algorithms at the edge.

Using this box, enterprises can collect the data, transform and analyze it on the box, and then send it to Azure over the network (and not in a truck). Using this, users can cut back on bandwidth cost and don’t have to send all of their data to the cloud for processing.

Also part of the same Data Box family is the Data Box Gateway. This one is a virtual appliance that runs on Hyper-V and VMware and lets users create a data transfer gateway for importing data into Azure. That’s not quite as interesting as a hardware appliance, but useful nonetheless.

Sep 24, 2018

Microsoft Teams gets bokeh and meeting recordings with transcripts

If you’ve ever attended a video meeting and wished that the speakers used really expensive cameras and lenses that allowed for that soft classy background blur of a portrait photo, then Microsoft wants to make that wish come true. The company announced a number of updates to Microsoft Teams today, and one of those is a feature that automatically detects faces and blurs the background behind a speaker.

While background blur is nice (or at least we have to assume it will be because we haven’t been able to try it yet), the more useful new feature in Teams is intelligent recordings. Teams can now automatically generate captions and provide time-coded transcripts for the replays. This feature is coming to Office 365 commercial customers now.

Microsoft first demoed these new transcription capabilities at its Build developer conference earlier this year. In that demo, the transcription service was able to distinguish between speakers and create a real-time transcript of the meeting.

If you want to create live streams and on-demand video for a wider audience inside your company, Teams is also getting that capability next month, together with Microsoft Stream and Yammer (which seems to be lingering in the shadow of Teams these days).
