Apr 16, 2021

Add Microsoft Azure Monitoring Within Percona Monitoring and Management 2.16.0


Microsoft Azure SQL Database was among the most popular databases of 2020, according to DB-Engines’ DBMS of the Year award, and it is steadily climbing the DB-Engines Ranking. The ranking is updated monthly and orders database management systems by popularity. In case you didn’t know, DB-Engines is an initiative to collect and present information on database management systems (DBMS).

So we are excited to share that, starting with the Percona Monitoring and Management (PMM) 2.16.0 release, you can monitor Azure instances. PMM can collect Azure DB metrics as well as the system metrics Azure makes available.

Only basic metrics are provided by the Azure Portal, so no disk, virtual CPU, or RAM data is available in PMM dashboards. Here is an example of the home page with a monitored Azure service; it’s shown in the middle row.

DB metrics, on the other hand, are collected by exporters directly from the services, which gives you access to all possible metrics. You can find screenshots of the MySQL and PostgreSQL dashboards at the end of this blog post.

Simple Steps to Add an Azure DB Service and Get Metrics in PMM

  • The feature is a technical preview and has to be enabled on the Settings page. Turning this feature OFF will not remove already-added services from monitoring; it will just hide the ability to discover and add new Microsoft Azure services.

This feature is a technical preview because we are releasing it as early as possible to get feedback from users. We expect to do more work on it to make it more API- and resource-efficient.

  • Go to the “Add Instance” page (Configuration … PMM Inventory … Add Instance).

  • Press the “Microsoft Azure MySQL or PostgreSQL” button and fill in the requested Azure and DB credentials.


Please follow the link “Where do I get the security credentials for my Azure DB instance” if you are missing any of the credential parameters.

Also, please keep in mind that a separate node will be created for each service. It is named after the service hostname and can’t be changed, but you may specify a service name when adding the service details. By default, the node and service names are equal.

That’s it. You can now go to the list of dashboards and observe the collected data.
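If you are curious what sits behind this, the DB-level numbers PMM pulls from Azure come from the Azure Monitor API. Below is a minimal sketch of such a query using the azure-identity and azure-mgmt-monitor Python packages; the credentials, resource ID, and metric name are placeholders, not PMM’s actual implementation.

```python
# A minimal sketch of pulling Azure DB metrics via the Azure Monitor API.
# All identifiers below are placeholders; PMM's actual collector differs.
from azure.identity import ClientSecretCredential
from azure.mgmt.monitor import MonitorManagementClient

credential = ClientSecretCredential(
    tenant_id="<tenant-id>",
    client_id="<client-id>",
    client_secret="<client-secret>",
)
monitor = MonitorManagementClient(credential, "<subscription-id>")

# Resource ID of an Azure Database for MySQL server (placeholder).
resource_id = (
    "/subscriptions/<subscription-id>/resourceGroups/<group>"
    "/providers/Microsoft.DBforMySQL/servers/<server-name>"
)

# Fetch average CPU over one hour at one-minute granularity.
metrics = monitor.metrics.list(
    resource_id,
    timespan="2021-04-16T10:00:00Z/2021-04-16T11:00:00Z",
    interval="PT1M",
    metricnames="cpu_percent",
    aggregation="Average",
)
for metric in metrics.value:
    for series in metric.timeseries:
        for point in series.data:
            print(point.time_stamp, point.average)
```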

If you are a Microsoft Azure user or are going to become one, please give Percona Monitoring and Management a test run. We are always open to suggestions and proposals. Please contact us, leave a message on our forum, or join our Slack channel.

Here are screenshots of the “MySQL Instance Summary” and “PostgreSQL Instance Summary” dashboards for Azure instances.


Read more about the release of Percona Monitoring and Management 2.16 and all the exciting new features included with it!

Percona Live ONLINE, the open source database conference, is coming up quickly! Registration is now OPEN… and FREE! 

Mar 02, 2021

Microsoft launches Azure Percept, its new hardware and software platform to bring AI to the edge

Microsoft today announced Azure Percept, its new hardware and software platform for bringing more of its Azure AI services to the edge. Percept combines Microsoft’s Azure cloud tools for managing devices and creating AI models with hardware from Microsoft’s device partners. The general idea here is to make it far easier for all kinds of businesses to build and implement AI for things like object detection, anomaly detection, shelf analytics and keyword spotting at the edge by providing them with an end-to-end solution that takes them from building AI models to deploying them on compatible hardware.

To kickstart this, Microsoft also today launches a hardware development kit with an intelligent camera for vision use cases (dubbed Azure Percept Vision). The kit features hardware-enabled AI modules for running models at the edge, but it can also be connected to the cloud. Users will also be able to trial their proofs-of-concept in the real world because the development kit conforms to the widely used 80/20 T-slot framing architecture.

In addition to Percept Vision, Microsoft is also launching Azure Percept Audio for audio-centric use cases.

Azure Percept devices, including Trust Platform Module, Azure Percept Vision and Azure Percept Audio

“We’ve started with the two most common AI workloads, vision and voice, sight and sound, and we’ve given out that blueprint so that manufacturers can take the basics of what we’ve started,” said Roanne Sones, the corporate vice president of Microsoft’s edge and platform group. “But they can envision it in any kind of responsible form factor to cover a pattern of the world.”

Percept customers will have access to Azure’s cognitive services and machine learning models, and Percept devices will automatically connect to Azure’s IoT Hub.

Microsoft says it is working with silicon and equipment manufacturers to build an ecosystem of “intelligent edge devices that are certified to run on the Azure Percept platform.” Over the course of the next few months, Microsoft plans to certify third-party devices for inclusion in this program, which will ideally allow its customers to take their proofs-of-concept and easily deploy them to any certified devices.

“Anybody who builds a prototype using one of our development kits, if they buy a certified device, they don’t have to do any additional work,” said Christa St. Pierre, a product manager in Microsoft’s Azure edge and platform group.

St. Pierre also noted that all of the components of the platform will have to conform to Microsoft’s responsible AI principles — and go through extensive security testing.




Mar 02, 2021

Microsoft Azure expands its NoSQL portfolio with Managed Instances for Apache Cassandra

At its Ignite conference today, Microsoft announced the launch of Azure Managed Instance for Apache Cassandra, its latest NoSQL database offering and a competitor to Cassandra-centric companies like Datastax. Microsoft describes the new service as a ‘semi-managed’ offering that will help companies bring more of their Cassandra-based workloads into its cloud.

“Customers can easily take on-prem Cassandra workloads and add limitless cloud scale while maintaining full compatibility with the latest version of Apache Cassandra,” Microsoft explains in its press materials. “Their deployments gain improved performance and availability, while benefiting from Azure’s security and compliance capabilities.”

Like its counterpart, Azure SQL Managed Instance, the idea here is to give users access to a scalable, cloud-based database service. To use Cassandra in Azure before, businesses had to either move to Cosmos DB, Microsoft’s highly scalable database service that supports the Cassandra, MongoDB, SQL and Gremlin APIs, or manage their own fleet of virtual machines or on-premises infrastructure.
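To make the compatibility point concrete, here is a minimal sketch of connecting to a Cassandra endpoint with the standard Python driver (cassandra-driver); the host and credentials are placeholders, and the same code would run against a self-managed cluster.

```python
# Minimal sketch: the standard Cassandra driver talking to a managed endpoint.
# Host and credentials are placeholders, not a real Azure endpoint.
from cassandra.cluster import Cluster
from cassandra.auth import PlainTextAuthProvider

auth = PlainTextAuthProvider(username="<user>", password="<password>")
cluster = Cluster(["<managed-instance-host>"], port=9042, auth_provider=auth)
session = cluster.connect()

# Any CQL an on-prem cluster accepts should work unchanged.
row = session.execute("SELECT release_version FROM system.local").one()
print(row.release_version)
cluster.shutdown()
```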

Cassandra was originally developed at Facebook and then open-sourced in 2008. A year later, it joined the Apache Foundation and today it’s used widely across the industry, with companies like Apple and Netflix betting on it for some of their core services, for example. AWS launched a managed Cassandra-compatible service at its re:Invent conference in 2019 (it’s called Amazon Keyspaces today), while Microsoft launched the Cassandra API for Cosmos DB in September 2018. With today’s announcement, though, the company can now offer a full range of Cassandra-based services for enterprises that want to move these workloads to its cloud.




Feb 17, 2021

Microsoft’s Dapr open-source project to help developers build cloud-native apps hits 1.0

Dapr, the Microsoft-incubated open-source project that aims to make it easier for developers to build event-driven, distributed cloud-native applications, hit its 1.0 milestone today, signifying the project’s readiness for production use cases. Microsoft launched the Distributed Application Runtime (that’s what “Dapr” stands for) back in October 2019. Since then, the project has released 14 updates, and the community has launched integrations with virtually all major cloud providers, including Azure, AWS, Alibaba and Google Cloud.

The goal for Dapr, Microsoft Azure CTO Mark Russinovich told me, was to democratize cloud-native development for enterprise developers.

“When we go look at what enterprise developers are being asked to do — they’ve traditionally been doing client, server, web plus database-type applications,” he noted. “But now, we’re asking them to containerize and to create microservices that scale out and have no-downtime updates — and they’ve got to integrate with all these cloud services. And many enterprises are, on top of that, asking them to make apps that are portable across on-premises environments as well as cloud environments or even be able to move between clouds. So just tons of complexity has been thrown at them that’s not specific to or not relevant to the business problems they’re trying to solve.”

And a lot of the development involves re-inventing the wheel to make their applications reliably talk to various other services. The idea behind Dapr is to give developers a single runtime that, out of the box, provides the tools that developers need to build event-driven microservices. Among other things, Dapr provides various building blocks for things like service-to-service communications, state management, pub/sub and secrets management.

Image Credits: Dapr
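For a concrete taste of those building blocks, here is a minimal sketch of Dapr’s state management HTTP API called from Python; it assumes a Dapr sidecar on the default port 3500 with a state store component named “statestore”.

```python
# Minimal sketch of Dapr's state-store building block over its HTTP API.
# Assumes a Dapr sidecar on localhost:3500 and a component named "statestore".
import requests

DAPR_STATE_URL = "http://localhost:3500/v1.0/state/statestore"

# Save state: the sidecar persists it to whichever store is configured
# (Redis, Cosmos DB, etc.) without the app knowing which one.
requests.post(DAPR_STATE_URL, json=[{"key": "order-1", "value": {"qty": 3}}])

# Read it back through the same API.
resp = requests.get(f"{DAPR_STATE_URL}/order-1")
print(resp.json())  # {'qty': 3}
```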

“The goal with Dapr was: let’s take care of all of the mundane work of writing one of these cloud-native distributed, highly available, scalable, secure cloud services, away from the developers so they can focus on their code. And actually, we took lessons from serverless, from Functions-as-a-Service where with, for example Azure Functions, it’s event-driven, they focus on their business logic and then things like the bindings that come with Azure Functions take care of connecting with other services,” Russinovich said.

He also noted that another goal here was to do away with language-specific models and to create a programming model that can be leveraged from any language. Enterprises, after all, tend to use multiple languages in their existing code, and a lot of them are now looking at how to best modernize their existing applications — without throwing out all of their current code.

As Russinovich noted, the project now has more than 700 contributors outside of Microsoft (though the core committers are largely from Microsoft) and a number of businesses started using it in production before the 1.0 release. One of the larger cloud providers that is already using it is Alibaba. “Alibaba Cloud has really fallen in love with Dapr and is leveraging it heavily,” he said. Other organizations that have contributed to Dapr include HashiCorp and early users like ZEISS, Ignition Group and New Relic.

And while it may seem a bit odd for a cloud provider to be happy that its competitors are using its innovations already, Russinovich noted that this was exactly the plan and that the team hopes to bring Dapr into a foundation soon.

“We’ve been on a path to open governance for several months and the goal is to get this into a foundation. […] The goal is opening this up. It’s not a Microsoft thing. It’s an industry thing,” he said — but he wasn’t quite ready to say to which foundation the team is talking.


Dec 22, 2020

With a $50B run rate in reach, can anyone stop AWS?

AWS, Amazon’s flourishing cloud arm, has been growing at a rapid clip for more than a decade. An early public cloud infrastructure vendor, it has taken advantage of first-to-market status to become the most successful player in the space. In fact, one could argue that many of today’s startups wouldn’t have gotten off the ground without the formation of cloud companies like AWS giving them easy access to infrastructure without having to build it themselves.

In Amazon’s most recent earnings report, AWS generated revenues of $11.6 billion, good for a run rate of more than $46 billion. That makes the next AWS milestone a run rate of $50 billion, something that could be in reach in less than two quarters if it continues its pace of revenue growth.
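The back-of-the-envelope math checks out; here is a quick sketch (the quarterly compounding below is an assumption derived from the roughly 29% annual growth rate cited later in this piece):

```python
# Back-of-the-envelope run-rate math (a sketch, not Amazon's own figures).
quarterly_revenue = 11.6  # $B, most recent quarter
run_rate = quarterly_revenue * 4
print(run_rate)  # 46.4 -> "more than $46 billion"

# Assume ~29% year-over-year growth continues; the quarterly factor
# below is an assumption derived from that annual rate.
quarterly_growth = 1.29 ** 0.25
for quarter in range(1, 3):
    projected = run_rate * quarterly_growth ** quarter
    print(quarter, round(projected, 1))  # ~49.4 after 1 quarter, ~52.7 after 2
```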

The good news for competing companies is that in spite of the market size and relative maturity, there is still plenty of room to grow.

The cloud division’s growth is slowing in percentage terms as it comes firmly up against the law of large numbers: AWS has to grow every quarter against an ever-larger revenue base. The result of this dynamic is that while AWS’ year-over-year growth rate is slowing over time — from 35% in Q3 2019 to 29% in Q3 2020 — the pace at which it is adding $10 billion chunks of annual revenue run rate is accelerating.

At the AWS re:Invent customer conference this year, AWS CEO Andy Jassy talked about the pace of change over the years, saying that it took the following number of months to grow its run rate by $10 billion increments:

123 months ($0 to $10 billion)
23 months ($10 billion to $20 billion)
13 months ($20 billion to $30 billion)
12 months ($30 billion to $40 billion)

Image Credits: TechCrunch (data from AWS)

Extrapolating from the above trend, it should take AWS fewer than 12 months to scale from a run rate of $40 billion to $50 billion. Stating the obvious, Jassy said “the rate of growth in AWS continues to accelerate.” He also took the time to point out that AWS is now the fifth-largest enterprise IT company in the world, ahead of enterprise stalwarts like SAP and Oracle.

What’s amazing is that AWS achieved its scale so fast, not even existing until 2006. That growth rate makes us ask a question: Can anyone hope to stop AWS’ momentum?

The short answer is that it doesn’t appear likely.

Cloud market landscape

A good place to start is surveying the cloud infrastructure competitive landscape to see if there are any cloud companies that could catch the market leader. According to Synergy Research, AWS remains firmly in front, and it doesn’t look like any competitor could catch AWS anytime soon unless some market dynamic caused a drastic change.

Synergy Research Cloud marketshare leaders. Amazon is first, Microsoft is second and Google is third.

Image Credits: Synergy Research

With around a third of the market, AWS is the clear front-runner. Its closest and fiercest rival, Microsoft, has around 20%. To put that into perspective, last quarter AWS had $11.6 billion in revenue compared to Microsoft’s $5.2 billion Azure result. Microsoft’s equivalent cloud number is growing faster, at 47%, but like AWS’ it has begun to drop steadily as Azure gains market share and higher revenue and falls victim to that same law of large numbers.

Sep 22, 2020

Microsoft challenges Twilio with the launch of Azure Communication Services

Microsoft today announced the launch of Azure Communication Services, a new set of features in its cloud that enable developers to add voice and video calling, chat and text messages to their apps, as well as old-school telephony.

The company describes the new set of services as the “first fully managed communication platform offering from a major cloud provider,” and that seems right, given that Google and AWS offer some of these features, including the AWS notification service, for example, but not as part of a cohesive communication service. Indeed, it seems Azure Communication Services is more of a competitor to the core features of Twilio or the up-and-coming MessageBird.

Over the course of the last few years, Microsoft has built up a lot of experience in this area, in large parts thanks to the success of its Teams service. Unsurprisingly, that’s something Microsoft is also playing up in its announcement.

“Azure Communication Services is built natively on top of a global, reliable cloud — Azure. Businesses can confidently build and deploy on the same low latency global communication network used by Microsoft Teams to support over 5 billion meeting minutes in a single day,” writes Scott Van Vliet, corporate vice president for Intelligent Communication at the company.

Microsoft also stresses that it offers a set of additional smart services that developers can tap into to build out their communication services, including its translation tools, for example. The company also notes that its services are encrypted to meet HIPAA and GDPR standards.

Like similar services, developers access the various capabilities through a set of new APIs and SDKs.
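For a flavor of what that looks like, here is a minimal sketch using the azure-communication-sms Python SDK; the connection string and phone numbers are placeholders, and the SDKs were brand new at the time of this announcement.

```python
# Minimal sketch: sending an SMS via Azure Communication Services.
# Connection string and numbers are placeholders; the SDK was in preview.
from azure.communication.sms import SmsClient

client = SmsClient.from_connection_string("<acs-connection-string>")
result = client.send(
    from_="+15551234567",  # a number provisioned through ACS
    to="+15557654321",
    message="Your order has shipped.",
)
```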

As for the core services, the capabilities here are pretty much what you’d expect. There’s voice and video calling (and the ability to shift between them). There’s support for chat and, starting in October, users will also be able to send text messages. Microsoft says developers will be able to send these to users anywhere, with Microsoft positioning it as a global service.

Provisioning phone numbers, too, is part of the service, and developers will be able to provision numbers for inbound and outbound calls, port existing numbers, request new ones and — most importantly for contact-center users — integrate them with existing on-premises equipment and carrier networks.

“Our goal is to meet businesses where they are and provide solutions to help them be resilient and move their business forward in today’s market,” writes Van Vliet. “We see rich communication experiences – enabled by voice, video, chat, and SMS – continuing to be an integral part in how businesses connect with their customers across devices and platforms.”

Sep 22, 2020

Microsoft brings data services to its Arc multi-cloud management service

Microsoft today launched a major update to its Arc multi-cloud service that allows Azure customers to run and manage workloads across clouds — including those of Microsoft’s competitors — and their on-premises data centers. First announced at Microsoft Ignite in 2019, Arc was always meant to not just help users manage their servers but also allow them to run data services like Azure SQL and Azure Database for PostgreSQL, close to where their data sits.

Today, the company is making good on this promise with the preview launch of Azure Arc-enabled data services with support for, as expected, Azure SQL and Azure Database for PostgreSQL.

In addition, Microsoft is making the core feature of Arc, Arc-enabled servers, generally available. These are the tools at the core of the service that allow enterprises to use the standard Azure Portal to manage and monitor their Windows and Linux servers across multi-cloud and edge environments.

Image Credits: Microsoft

“We’ve always known that enterprises are looking to unlock the agility of the cloud — they love the app model, they love the business model — while balancing a need to maintain certain applications and workloads on premises,” Rohan Kumar, Microsoft’s corporate VP for Azure Data, said. “A lot of customers actually have a multi-cloud strategy. In some cases, they need to keep the data specifically for regulatory compliance. And in many cases, they want to maximize their existing investments. They’ve spent a lot of CapEx.”

As Kumar stressed, Microsoft wants to meet customers where they are, without forcing them to adopt a container architecture, for example, or replace their specialized engineered appliances to use Arc.

“Hybrid is really [about] providing that flexible choice to our customers, meeting them where they are, and not prescribing a solution,” he said.

He admitted that this approach makes engineering the solution more difficult, but the team decided the baseline should be a container endpoint and nothing more. And for the most part, Microsoft packaged up the tools its own engineers were already using to run Azure services on the company’s own infrastructure to manage these services in a multi-cloud environment.

“In hindsight, it was a little challenging at the beginning, because, you can imagine, when we initially built them, we didn’t imagine that we’ll be packaging them like this. But it’s a very modern design point,” Kumar said. But the result is that supporting customers is now relatively easy because it’s so similar to what the team does in Azure, too.

Kumar noted that another selling point for the Azure Arc-enabled data services is that the version of Azure SQL they deliver is essentially evergreen, allowing customers to stop worrying about SQL Server licensing and end-of-life support questions.

Jul 14, 2020

Google Cloud’s new BigQuery Omni will let developers query data in GCP, AWS and Azure

At its virtual Cloud Next ’20 event, Google today announced a number of updates to its cloud portfolio, but the private alpha launch of BigQuery Omni is probably the highlight of this year’s event. Powered by Google Cloud’s Anthos hybrid-cloud platform, BigQuery Omni allows developers to use the BigQuery engine to analyze data that sits in multiple clouds, including those of Google Cloud competitors like AWS and Microsoft Azure — though for now, the service only supports AWS, with Azure support coming later.

Using a unified interface, developers can analyze this data locally without having to move data sets between platforms.

“Our customers store petabytes of information in BigQuery, with the knowledge that it is safe and that it’s protected,” said Debanjan Saha, the GM and VP of Engineering for Data Analytics at Google Cloud, in a press conference ahead of today’s announcement. “A lot of our customers do many different types of analytics in BigQuery. For example, they use the built-in machine learning capabilities to run real-time analytics and predictive analytics. […] A lot of our customers who are very excited about using BigQuery in GCP are also asking, ‘how can they extend the use of BigQuery to other clouds?’ ”

Image Credits: Google

Google has long said that it believes that multi-cloud is the future — something that most of its competitors would probably agree with, though they all would obviously like you to use their tools, even if the data sits in other clouds or is generated off-platform. It’s the tools and services that help businesses to make use of all of this data, after all, where the different vendors can differentiate themselves from each other. Maybe it’s no surprise then, given Google Cloud’s expertise in data analytics, that BigQuery is now joining the multi-cloud fray.

“With BigQuery Omni customers get what they wanted,” Saha said. “They wanted to analyze their data no matter where the data sits and they get it today with BigQuery Omni.”

Image Credits: Google

He noted that Google Cloud believes that this will help enterprises break down their data silos and gain new insights into their data, all while allowing developers and analysts to use a standard SQL interface.
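For reference, here is a minimal sketch of what such a query looks like through the standard BigQuery Python client; the project, dataset and table names are placeholders. The point of Omni is that a query over data sitting in another cloud is meant to look exactly like this.

```python
# Minimal sketch: standard SQL through the BigQuery Python client.
# Project/dataset/table names are placeholders; with Omni, the table's data
# may physically live in another cloud, but the query interface is unchanged.
from google.cloud import bigquery

client = bigquery.Client(project="<gcp-project>")
query = """
    SELECT customer_region, COUNT(*) AS orders
    FROM `<gcp-project>.<dataset>.orders`
    GROUP BY customer_region
    ORDER BY orders DESC
"""
for row in client.query(query).result():
    print(row.customer_region, row.orders)
```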

Today’s announcement is also a good example of how Google’s bet on Anthos is paying off by making it easier for the company to not just allow its customers to manage their multi-cloud deployments but also to extend the reach of its own products across clouds. This also explains why BigQuery Omni isn’t available for Azure yet, given that Anthos for Azure is still in preview, while AWS support became generally available in April.

May 27, 2020

Docker expands relationship with Microsoft to ease developer experience across platforms

When Docker sold off its enterprise division to Mirantis last fall, that didn’t mark the end of the company. In fact, Docker still exists and has refocused as a cloud-native developer tools vendor. Today it announced an expanded partnership with Microsoft around simplifying running Docker containers in Azure.

As its new mission suggests, it involves tighter integration between Docker and a couple of Azure developer tools including Visual Studio Code and Azure Container Instances (ACI). According to Docker, it can take developers hours or even days to set up their containerized environment across the two sets of tools.

The idea of the integration is to make it easier, faster and more efficient to include Docker containers when developing applications with the Microsoft tool set. Docker CEO Scott Johnston says it’s a matter of giving developers a better experience.

“Extending our strategic relationship with Microsoft will further reduce the complexity of building, sharing and running cloud-native, microservices-based applications for developers. Docker and VS Code are two of the most beloved developer tools and we are proud to bring them together to deliver a better experience for developers building container-based apps for Azure Container Instances,” Johnston said in a statement.

Among the features they are announcing is the ability to log into Azure directly from the Docker command line interface, a big simplification that reduces going back and forth between the two sets of tools. What’s more, developers can set up a Microsoft ACI environment complete with a set of configuration defaults. Developers will also be able to switch easily between their local desktop instance and the cloud to run applications.

These and other integrations are designed to make it easier for users of both Azure and Docker to work in the Microsoft cloud service without having to jump through a lot of extra hoops.

It’s worth noting that these integrations are launching in beta, but the company promises they should be released sometime in the second half of this year.

May 19, 2020

Microsoft launches Azure Synapse Link to help enterprises get faster insights from their data

At its Build developer conference, Microsoft today announced Azure Synapse Link, a new enterprise service that allows businesses to analyze their data faster and more efficiently, using an approach that’s generally called “hybrid transaction/analytical processing” (HTAP). That’s a mouthful; it essentially enables enterprises to use a single database system for both analytical and transactional workloads. Traditionally, enterprises had to make a trade-off between building a single system for both, which was often highly over-provisioned, and maintaining separate systems for transactional and analytics workloads.

Last year, at its Ignite conference, Microsoft announced Azure Synapse Analytics, an analytics service that combines analytics and data warehousing to create what the company calls “the next evolution of Azure SQL Data Warehouse.” Synapse Analytics brings together data from Microsoft’s services and those from its partners and makes it easier to analyze.

“One of the key things, as we work with our customers on their digital transformation journey, there is an aspect of being data-driven, of being insights-driven as a culture, and a key part of that really is that once you decide there is some amount of information or insights that you need, how quickly are you able to get to that? For us, time to insight and a secondary element, which is the cost it takes, the effort it takes to build these pipelines and maintain them with an end-to-end analytics solution, was a key metric we have been observing for multiple years from our largest enterprise customers,” said Rohan Kumar, Microsoft’s corporate VP for Azure Data.

Synapse Link takes the work Microsoft did on Synapse Analytics a step further by removing the barriers between Azure’s operational databases and Synapse Analytics, so enterprises can immediately get value from the data in those databases without going through a data warehouse first.

“What we are announcing with Synapse Link is the next major step in the same vision that we had around reducing the time to insight,” explained Kumar. “And in this particular case, a long-standing barrier that exists today between operational databases and analytics systems is these complex ETL (extract, transform, load) pipelines that need to be set up just so you can do basic operational reporting or where, in a very transactionally consistent way, you need to move data from your operational system to the analytics system, because you don’t want to impact the performance of the operational system in any way because that’s typically dealing with, depending on the system, millions of transactions per second.”

ETL pipelines, Kumar argued, are typically expensive and hard to build and maintain, yet enterprises are now building new apps — and maybe even line of business mobile apps — where any action that consumers take and that is registered in the operational database is immediately available for predictive analytics, for example.

From the user perspective, enabling this only takes a single click to link the two, while it removes the need for managing additional data pipelines or database resources. That, Kumar said, was always the main goal for Synapse Link. “With a single click, you should be able to enable real-time analytics on your operational data in ways that don’t have any impact on your operational systems, so you’re not using the compute part of your operational system to do the query, you actually have to transform the data into a columnar format, which is more adaptable for analytics, and that’s really what we achieved with Synapse Link.”

Because traditional HTAP systems on-premises typically share their compute resources with the operational database, those systems never quite took off, Kumar argued. In the cloud, with Synapse Link, though, that impact doesn’t exist because you’re dealing with two separate systems. Now, once a transaction gets committed to the operational database, the Synapse Link system transforms the data into a columnar format that is more optimized for the analytics system — and it does so in real time.
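As a toy illustration of that row-to-columnar step, here is a sketch using pyarrow as a stand-in; Synapse Link’s actual internal format is not public.

```python
# Toy illustration of a row-to-columnar transformation, with pyarrow as a
# stand-in; Synapse Link's actual internal format is not public.
import pyarrow as pa

# Committed transactions arrive row by row from the operational store.
rows = [
    {"order_id": 1, "amount": 19.99, "region": "EU"},
    {"order_id": 2, "amount": 5.49, "region": "US"},
]

# The analytics side stores the same data column by column, which is far
# cheaper to scan and aggregate.
table = pa.Table.from_pylist(rows)
print(table.column("amount"))  # one contiguous column of amounts
```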

For now, Synapse Link is only available in conjunction with Microsoft’s Cosmos DB database. As Kumar told me, that’s because that’s where the company saw the highest demand for this kind of service, but you can expect the company to add support for Azure SQL, Azure Database for PostgreSQL and Azure Database for MySQL in the future.
