Dec 14, 2022

Generate a pgBadger Report From PostgreSQL Flexible Server Logs


In one of our previous posts, Detailed Logging for Enterprise-Grade PostgreSQL, we discussed the parameters that enable detailed logging and introduced the log analyzer pgBadger. In this blog post, we will configure a Microsoft Azure provisioned PostgreSQL Flexible Server to populate logs and generate a pgBadger report.

The latest pgBadger utility supports JSON-format logs. Microsoft Azure PostgreSQL Flexible Server does not provide PostgreSQL logs the way a Single Server or an on-premises environment does; the logs are only populated, in JSON format, after you enable them.

In this blog, we will configure logging and generate a pgBadger report from the JSON-format logs; if you are using an older version of the pgBadger utility, we will first convert them into regular logs.

Before downloading the logs, we need to tune the logging-related parameters in the postgresql.conf file and reload the configuration. You can download and install pgBadger from here.
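As a reference, here is a typical set of logging parameters following the recommendations in the pgBadger documentation; treat the values as a starting point rather than a prescription, and note that the log_line_prefix below matches the --prefix used in the report commands later in this post:

log_min_duration_statement = 0   # log every statement with its duration; raise this on busy systems
log_line_prefix = '%t-%c-'       # timestamp and session ID, matching pgbadger --prefix='%t-%c-'
log_checkpoints = on
log_connections = on
log_disconnections = on
log_lock_waits = on
log_temp_files = 0
log_autovacuum_min_duration = 0
log_error_verbosity = default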

Also, you can go through the documentation here and navigate to the pgBadger section to learn more about it.

Configuration

From the Microsoft Azure Cloud console — https://portal.azure.com/#home  — we need to create a storage account as shown below:


Click on the Create option and fill in details like the name and resource group as shown below:

Next, configure the existing PostgreSQL Flexible Server to use the storage account for generating PostgreSQL logs: select Diagnostic settings and add the already-created storage account.


Log in to Microsoft Azure, go to Storage accounts, and open the storage account that was created for the PostgreSQL Flexible Server logs. Navigate to the location of the logs as shown below:

Navigate to the date and time to choose the hourly JSON file(s) required. Right-click the .json file and download the log, which will be in JSON format.

Sample .json logs look like the below:
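The exact payload depends on your diagnostic settings; the record below is an illustrative, hand-written sample, so the field values and the resourceId path are placeholders, not real output:

{
  "time": "2022-12-14T10:15:30.000Z",
  "resourceId": "/SUBSCRIPTIONS/.../PROVIDERS/MICROSOFT.DBFORPOSTGRESQL/FLEXIBLESERVERS/MYSERVER",
  "category": "PostgreSQLLogs",
  "operationName": "LogEvent",
  "properties": {
    "timestamp": "2022-12-14 10:15:30.000 UTC",
    "processId": 12345,
    "errorLevel": "LOG",
    "message": "duration: 1001.234 ms  statement: SELECT pg_sleep(1);"
  }
}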

 

Generate pgBadger report

Use the Jump server provisioned for pgBadger, and copy the JSON file from the local machine to the Jump server.

If you are using the latest pgBadger utility, you can pass the JSON-format logs using the -f option to generate a pgBadger report (the --prefix value must match the log_line_prefix the server was configured with):

pgbadger --prefix='%t-%c-' -f jsonlog PT1H.json -o pgbadger_report.html

If you do not have the option to use the latest pgBadger utility, then use the Linux command below to extract the PostgreSQL log lines from the JSON file and generate a postgresql.log file.

cut -f9- -d\: PT1H.json | cut -f1 -d\} | sed -e 's/^."//' -e 's/"$//' > postgresql.log
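The cut/sed pipeline above depends on the exact field order inside each JSON record, so it can be brittle. If jq is available on the Jump server, a sturdier variant is the sketch below; it assumes the log text lives in properties.message, as in the sample shown earlier:

# Extract the raw PostgreSQL log lines from the JSON records.
jq -r '.properties.message' PT1H.json > postgresql.log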

Generate the pgBadger report from the postgresql.log file and render it as an HTML file:

pgbadger --prefix='%t-%c-' postgresql.log -o pgbadger_report.html

Copy pgbadger_report.html from the Jump server to the local machine and review the pgBadger report.

Conclusion

The pgBadger utility keeps getting better as a log analyzer with each release, adding more features and functionality. We can configure and generate pgBadger reports from Microsoft Azure Flexible Server logs, and that does make a DBA’s life easier!

Apr 16, 2021

Add Microsoft Azure Monitoring Within Percona Monitoring and Management 2.16.0

The Microsoft Azure SQL Database is among the most popular databases of 2020, according to DB-Engines’ DBMS of the Year award. It is also steadily climbing in the DB-Engines Ranking, which is updated monthly and places database management systems according to their popularity. In case you didn’t know, DB-Engines is an initiative to collect and present information on database management systems (DBMS).

So we are excited to share that you can now monitor Azure instances in the Percona Monitoring and Management 2.16.0 (PMM) release. PMM can collect Azure DB metrics as well as available system metrics.

Only basic metrics are provided by the Azure Portal, so no disk, virtual CPU, or RAM data are available in PMM dashboards. Here is an example of a home page with a monitored Azure service; it’s shown in the middle row.

DB metrics are collected by exporters directly from the services, which lets you have all possible metrics. You can find some screenshots of MySQL and PostgreSQL dashboards at the end of this blog post.

Simple Steps to Add an Azure DB Service and Get Metrics in PMM

  • The feature is a technical preview and has to be enabled on the Settings page. Turning this feature OFF will not remove already-added services from monitoring; it will just hide the ability to discover and add new Microsoft Azure services.

We are releasing this feature as a technical preview to get feedback from users as soon as possible, and we expect to do more work on it to make it more API- and resource-efficient.

  • Go to the “Add Instance” page (Configuration … PMM Inventory … Add Instance).

  • Press the button “Microsoft Azure MySQL or PostgreSQL” and fill in the requested Azure and DB credentials.


Please follow the link “Where do I get the security credentials for my Azure DB instance” if some credential parameters are missing.

Also, please keep in mind that a separate node will be created for each service. It is named after the service hostname and can’t be changed, but you may specify a service name when adding the service details. By default, the node and service names are equal.

That’s it. You may go to the list of dashboards and observe collected data.

If you are a Microsoft Azure user, or about to become one, please give Percona Monitoring and Management a test run. We are always open to suggestions and proposals. Please contact us, leave a message on our forum, or join the Slack channel.

Here are screenshots of the “MySQL Instance Summary” and “PostgreSQL Instance Summary” dashboards for Azure instances.

 

Read more about the release of Percona Monitoring and Management 2.16 and all the exciting new features included with it!

Percona Live ONLINE, the open source database conference, is coming up quickly! Registration is now OPEN… and FREE! 

Mar 2, 2021

Microsoft launches Azure Percept, its new hardware and software platform to bring AI to the edge

Microsoft today announced Azure Percept, its new hardware and software platform for bringing more of its Azure AI services to the edge. Percept combines Microsoft’s Azure cloud tools for managing devices and creating AI models with hardware from Microsoft’s device partners. The general idea here is to make it far easier for all kinds of businesses to build and implement AI for things like object detection, anomaly detection, shelf analytics and keyword spotting at the edge by providing them with an end-to-end solution that takes them from building AI models to deploying them on compatible hardware.

To kickstart this, Microsoft also today launches a hardware development kit with an intelligent camera for vision use cases (dubbed Azure Percept Vision). The kit features hardware-enabled AI modules for running models at the edge, but it can also be connected to the cloud. Users will also be able to trial their proofs-of-concept in the real world because the development kit conforms to the widely used 80/20 T-slot framing architecture.

In addition to Percept Vision, Microsoft is also launching Azure Percept Audio for audio-centric use cases.

Azure Percept devices, including Trust Platform Module, Azure Percept Vision and Azure Percept Audio

“We’ve started with the two most common AI workloads, vision and voice, sight and sound, and we’ve given out that blueprint so that manufacturers can take the basics of what we’ve started,” said Roanne Sones, corporate vice president of Microsoft’s edge and platform group. “But they can envision it in any kind of responsible form factor to cover a pattern of the world.”

Percept customers will have access to Azure’s cognitive services and machine learning models, and Percept devices will automatically connect to Azure’s IoT Hub.

Microsoft says it is working with silicon and equipment manufacturers to build an ecosystem of “intelligent edge devices that are certified to run on the Azure Percept platform.” Over the course of the next few months, Microsoft plans to certify third-party devices for inclusion in this program, which will ideally allow its customers to take their proofs-of-concept and easily deploy them to any certified devices.

“Anybody who builds a prototype using one of our development kits, if they buy a certified device, they don’t have to do any additional work,” said Christa St. Pierre, a product manager in Microsoft’s Azure edge and platform group.

St. Pierre also noted that all of the components of the platform will have to conform to Microsoft’s responsible AI principles — and go through extensive security testing.




Mar 2, 2021

Microsoft Azure expands its NoSQL portfolio with Managed Instances for Apache Cassandra

At its Ignite conference today, Microsoft announced the launch of Azure Managed Instance for Apache Cassandra, its latest NoSQL database offering and a competitor to Cassandra-centric companies like DataStax. Microsoft describes the new service as a ‘semi-managed’ offering that will help companies bring more of their Cassandra-based workloads into its cloud.

“Customers can easily take on-prem Cassandra workloads and add limitless cloud scale while maintaining full compatibility with the latest version of Apache Cassandra,” Microsoft explains in its press materials. “Their deployments gain improved performance and availability, while benefiting from Azure’s security and compliance capabilities.”

Like its counterpart, Azure SQL Managed Instance, the idea here is to give users access to a scalable, cloud-based database service. To use Cassandra in Azure before, businesses had to either move to Cosmos DB, Microsoft’s highly scalable database service that supports the Cassandra, MongoDB, SQL and Gremlin APIs, or manage their own fleet of virtual machines or on-premises infrastructure.

Cassandra was originally developed at Facebook and then open-sourced in 2008. A year later, it joined the Apache Foundation, and today it’s used widely across the industry, with companies like Apple and Netflix betting on it for some of their core services, for example. AWS launched a managed Cassandra-compatible service at its re:Invent conference in 2019 (it’s called Amazon Keyspaces today), and Microsoft launched the Cassandra API for Cosmos DB in September 2018. With today’s announcement, though, the company can now offer a full range of Cassandra-based services for enterprises that want to move these workloads to its cloud.




Feb 17, 2021

Microsoft’s Dapr open-source project to help developers build cloud-native apps hits 1.0

Dapr, the Microsoft-incubated open-source project that aims to make it easier for developers to build event-driven, distributed cloud-native applications, hit its 1.0 milestone today, signifying the project’s readiness for production use cases. Microsoft launched the Distributed Application Runtime (that’s what “Dapr” stands for) back in October 2019. Since then, the project has released 14 updates and the community has launched integrations with virtually all major cloud providers, including Azure, AWS, Alibaba and Google Cloud.

The goal for Dapr, Microsoft Azure CTO Mark Russinovich told me, was to democratize cloud-native development for enterprise developers.

“When we go look at what enterprise developers are being asked to do — they’ve traditionally been doing client, server, web plus database-type applications,” he noted. “But now, we’re asking them to containerize and to create microservices that scale out and have no-downtime updates — and they’ve got to integrate with all these cloud services. And many enterprises are, on top of that, asking them to make apps that are portable across on-premises environments as well as cloud environments or even be able to move between clouds. So just tons of complexity has been thrown at them that’s not specific to or not relevant to the business problems they’re trying to solve.”

And a lot of the development involves re-inventing the wheel to make their applications reliably talk to various other services. The idea behind Dapr is to give developers a single runtime that, out of the box, provides the tools that developers need to build event-driven microservices. Among other things, Dapr provides various building blocks for things like service-to-service communications, state management, pub/sub and secrets management.

Image Credits: Dapr
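As a concrete taste of those building blocks, here is a minimal sketch of the state-management block used through the Dapr sidecar’s HTTP API from the command line; the port is Dapr’s default (3500), and the component name statestore comes from the default quickstart configuration, so treat both as assumptions:

# Save a key/value pair through the Dapr sidecar's state API.
curl -X POST http://localhost:3500/v1.0/state/statestore \
  -H "Content-Type: application/json" \
  -d '[{"key": "order_1", "value": {"status": "paid"}}]'

# Read the value back by key.
curl http://localhost:3500/v1.0/state/statestore/order_1

The application never talks to the underlying store (Redis, Cosmos DB, and so on) directly; swapping the store is a component-configuration change rather than a code change.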

“The goal with Dapr was: let’s take care of all of the mundane work of writing one of these cloud-native distributed, highly available, scalable, secure cloud services, away from the developers so they can focus on their code. And actually, we took lessons from serverless, from Functions-as-a-Service where with, for example Azure Functions, it’s event-driven, they focus on their business logic and then things like the bindings that come with Azure Functions take care of connecting with other services,” Russinovich said.

He also noted that another goal here was to do away with language-specific models and to create a programming model that can be leveraged from any language. Enterprises, after all, tend to use multiple languages in their existing code, and a lot of them are now looking at how to best modernize their existing applications — without throwing out all of their current code.

As Russinovich noted, the project now has more than 700 contributors outside of Microsoft (though the core committers are largely from Microsoft) and a number of businesses started using it in production before the 1.0 release. One of the larger cloud providers that is already using it is Alibaba. “Alibaba Cloud has really fallen in love with Dapr and is leveraging it heavily,” he said. Other organizations that have contributed to Dapr include HashiCorp and early users like ZEISS, Ignition Group and New Relic.

And while it may seem a bit odd for a cloud provider to be happy that its competitors are using its innovations already, Russinovich noted that this was exactly the plan and that the team hopes to bring Dapr into a foundation soon.

“We’ve been on a path to open governance for several months and the goal is to get this into a foundation. […] The goal is opening this up. It’s not a Microsoft thing. It’s an industry thing,” he said — but he wasn’t quite ready to say to which foundation the team is talking.

 

Dec 22, 2020

With a $50B run rate in reach, can anyone stop AWS?

AWS, Amazon’s flourishing cloud arm, has been growing at a rapid clip for more than a decade. An early public cloud infrastructure vendor, it has taken advantage of first-to-market status to become the most successful player in the space. In fact, one could argue that many of today’s startups wouldn’t have gotten off the ground without the formation of cloud companies like AWS giving them easy access to infrastructure without having to build it themselves.

In Amazon’s most-recent earnings report, AWS generated revenues of $11.6 billion, good for a run rate of more than $46 billion. That makes the next AWS milestone a run rate of $50 billion, something that could be in reach in less than two quarters if it continues its pace of revenue growth.
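For reference, the run-rate arithmetic is simply the quarterly figure annualized: $11.6 billion × 4 ≈ $46.4 billion, which is where the “more than $46 billion” comes from.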

The good news for competing companies is that in spite of the market size and relative maturity, there is still plenty of room to grow.

The cloud division’s growth is slowing in percentage terms as it comes firmly up against the law of large numbers, in which AWS has to grow every quarter compared to an ever-larger revenue base. The result of this dynamic is that while AWS’ year-over-year growth rate is slowing over time — from 35% in Q3 2019 to 29% in Q3 2020 — the pace at which it is adding $10 billion chunks of annual revenue run rate is accelerating.

At the AWS re:Invent customer conference this year, AWS CEO Andy Jassy talked about the pace of change over the years, saying that it took the following number of months to grow its run rate by $10 billion increments:

  • 123 months ($0 to $10 billion)

  • 23 months ($10 billion to $20 billion)

  • 13 months ($20 billion to $30 billion)

  • 12 months ($30 billion to $40 billion)

Image Credits: TechCrunch (data from AWS)

Extrapolating from the above trend, it should take AWS fewer than 12 months to scale from a run rate of $40 billion to $50 billion. Stating the obvious, Jassy said “the rate of growth in AWS continues to accelerate.” He also took the time to point out that AWS is now the fifth-largest enterprise IT company in the world, ahead of enterprise stalwarts like SAP and Oracle.

What’s amazing is that AWS achieved its scale so fast, not even existing until 2006. That growth rate makes us ask a question: Can anyone hope to stop AWS’ momentum?

The short answer is that it doesn’t appear likely.

Cloud market landscape

A good place to start is surveying the cloud infrastructure competitive landscape to see if there are any cloud companies that could catch the market leader. According to Synergy Research, AWS remains firmly in front, and it doesn’t look like any competitor could catch AWS anytime soon unless some market dynamic caused a drastic change.

Synergy Research Cloud marketshare leaders. Amazon is first, Microsoft is second and Google is third.

Image Credits: Synergy Research

With around a third of the market, AWS is the clear front-runner. Its closest and fiercest rival, Microsoft, has around 20%. To put that into perspective a bit, last quarter AWS had $11.6 billion in revenue compared to Microsoft’s $5.2 billion Azure result. Microsoft’s equivalent cloud number is growing faster, at 47%, but like AWS’, that growth rate has begun to drop steadily as Azure gains market share and higher revenue and falls victim to that same law of large numbers.

Sep 22, 2020

Microsoft challenges Twilio with the launch of Azure Communication Services

Microsoft today announced the launch of Azure Communication Services, a new set of features in its cloud that enable developers to add voice and video calling, chat and text messages to their apps, as well as old-school telephony.

The company describes the new set of services as the “first fully managed communication platform offering from a major cloud provider,” and that seems right, given that Google and AWS offer some of these features, including the AWS notification service, for example, but not as part of a cohesive communication service. Indeed, it seems Azure Communication Services is more of a competitor to the core features of Twilio or the up-and-coming MessageBird.

Over the course of the last few years, Microsoft has built up a lot of experience in this area, in large parts thanks to the success of its Teams service. Unsurprisingly, that’s something Microsoft is also playing up in its announcement.

“Azure Communication Services is built natively on top of a global, reliable cloud — Azure. Businesses can confidently build and deploy on the same low latency global communication network used by Microsoft Teams to support over 5 billion meeting minutes in a single day,” writes Scott Van Vliet, corporate vice president for Intelligent Communication at the company.

Microsoft also stresses that it offers a set of additional smart services that developers can tap into to build out their communication services, including its translation tools, for example. The company also notes that its services are encrypted to meet HIPAA and GDPR standards.

Like similar services, developers access the various capabilities through a set of new APIs and SDKs.

As for the core services, the capabilities here are pretty much what you’d expect. There’s voice and video calling (and the ability to shift between them). There’s support for chat and, starting in October, users will also be able to send text messages. Microsoft says developers will be able to send these to users anywhere, with Microsoft positioning it as a global service.

Provisioning phone numbers, too, is part of the services and developers will be able to provision those for in-bound and out-bound calls, port existing numbers, request new ones and — most importantly for contact-center users — integrate them with existing on-premises equipment and carrier networks.

“Our goal is to meet businesses where they are and provide solutions to help them be resilient and move their business forward in today’s market,” writes Van Vliet. “We see rich communication experiences – enabled by voice, video, chat, and SMS – continuing to be an integral part in how businesses connect with their customers across devices and platforms.”

Sep 22, 2020

Microsoft brings data services to its Arc multi-cloud management service

Microsoft today launched a major update to its Arc multi-cloud service that allows Azure customers to run and manage workloads across clouds — including those of Microsoft’s competitors — and their on-premises data centers. First announced at Microsoft Ignite in 2019, Arc was always meant to not just help users manage their servers but also allow them to run data services like Azure SQL and Azure Database for PostgreSQL, close to where their data sits.

Today, the company is making good on this promise with the preview launch of Azure Arc-enabled data services with support for, as expected, Azure SQL and Azure Database for PostgreSQL.

In addition, Microsoft is making the core feature of Arc, Arc-enabled servers, generally available. These are the tools at the core of the service that allow enterprises that use the standard Azure Portal to manage and monitor their Windows and Linux servers across their multi-cloud and edge environments.

Image Credits: Microsoft

“We’ve always known that enterprises are looking to unlock the agility of the cloud — they love the app model, they love the business model — while balancing a need to maintain certain applications and workloads on premises,” said Rohan Kumar, Microsoft’s corporate VP for Azure Data. “A lot of customers actually have a multi-cloud strategy. In some cases, they need to keep the data specifically for regulatory compliance. And in many cases, they want to maximize their existing investments. They’ve spent a lot of CapEx.”

As Kumar stressed, Microsoft wants to meet customers where they are, without forcing them to adopt a container architecture, for example, or replace their specialized engineered appliances to use Arc.

“Hybrid is really [about] providing that flexible choice to our customers, meeting them where they are, and not prescribing a solution,” he said.

He admitted that this approach makes engineering the solution more difficult, but the team decided the baseline should be a container endpoint and nothing more. And for the most part, Microsoft packaged up the tools its own engineers were already using to run Azure services on the company’s own infrastructure to manage these services in a multi-cloud environment.

“In hindsight, it was a little challenging at the beginning, because, you can imagine, when we initially built them, we didn’t imagine that we’ll be packaging them like this. But it’s a very modern design point,” Kumar said. But the result is that supporting customers is now relatively easy because it’s so similar to what the team does in Azure, too.

Kumar noted that one of the selling points for the Azure Data Services is also that the version of Azure SQL is essentially evergreen, allowing them to stop worrying about SQL Server licensing and end-of-life support questions.

Jul 14, 2020

Google Cloud’s new BigQuery Omni will let developers query data in GCP, AWS and Azure

At its virtual Cloud Next ’20 event, Google today announced a number of updates to its cloud portfolio, but the private alpha launch of BigQuery Omni is probably the highlight of this year’s event. Powered by Google Cloud’s Anthos hybrid-cloud platform, BigQuery Omni allows developers to use the BigQuery engine to analyze data that sits in multiple clouds, including those of Google Cloud competitors like AWS and Microsoft Azure — though for now, the service only supports AWS, with Azure support coming later.

Using a unified interface, developers can analyze this data locally without having to move data sets between platforms.

“Our customers store petabytes of information in BigQuery, with the knowledge that it is safe and that it’s protected,” said Debanjan Saha, the GM and VP of Engineering for Data Analytics at Google Cloud, in a press conference ahead of today’s announcement. “A lot of our customers do many different types of analytics in BigQuery. For example, they use the built-in machine learning capabilities to run real-time analytics and predictive analytics. […] A lot of our customers who are very excited about using BigQuery in GCP are also asking, ‘how can they extend the use of BigQuery to other clouds?’ ”

Image Credits: Google

Google has long said that it believes that multi-cloud is the future — something that most of its competitors would probably agree with, though they all would obviously like you to use their tools, even if the data sits in other clouds or is generated off-platform. It’s the tools and services that help businesses to make use of all of this data, after all, where the different vendors can differentiate themselves from each other. Maybe it’s no surprise then, given Google Cloud’s expertise in data analytics, that BigQuery is now joining the multi-cloud fray.

“With BigQuery Omni customers get what they wanted,” Saha said. “They wanted to analyze their data no matter where the data sits and they get it today with BigQuery Omni.”

Image Credits: Google

He noted that Google Cloud believes that this will help enterprises break down their data silos and gain new insights into their data, all while allowing developers and analysts to use a standard SQL interface.

Today’s announcement is also a good example of how Google’s bet on Anthos is paying off by making it easier for the company to not just allow its customers to manage their multi-cloud deployments but also to extend the reach of its own products across clouds. This also explains why BigQuery Omni isn’t available for Azure yet, given that Anthos for Azure is still in preview, while AWS support became generally available in April.

May 27, 2020

Docker expands relationship with Microsoft to ease developer experience across platforms

When Docker sold off its enterprise division to Mirantis last fall, that didn’t mark the end of the company. In fact, Docker still exists and has refocused as a cloud-native developer tools vendor. Today it announced an expanded partnership with Microsoft around simplifying running Docker containers in Azure.

As its new mission suggests, this involves tighter integration between Docker and a couple of Microsoft developer tools, including Visual Studio Code and Azure Container Instances (ACI). According to Docker, it can take developers hours or even days to set up their containerized environment across the two sets of tools.

The idea of the integration is to make it easier, faster and more efficient to include Docker containers when developing applications with the Microsoft tool set. Docker CEO Scott Johnston says it’s a matter of giving developers a better experience.

“Extending our strategic relationship with Microsoft will further reduce the complexity of building, sharing and running cloud-native, microservices-based applications for developers. Docker and VS Code are two of the most beloved developer tools and we are proud to bring them together to deliver a better experience for developers building container-based apps for Azure Container Instances,” Johnston said in a statement.

Among the features they are announcing is the ability to log into Azure directly from the Docker command line interface, a big simplification that reduces going back and forth between the two sets of tools. What’s more, developers can set up a Microsoft ACI environment complete with a set of configuration defaults. Developers will also be able to switch easily between their local desktop instance and the cloud to run applications.
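Based on Docker’s announcement, the beta flow looks roughly like the sketch below; the context name myacicontext is arbitrary, and the exact prompts may differ:

# Log in to Azure straight from the Docker CLI.
docker login azure

# Create a Docker context backed by an Azure Container Instances environment.
docker context create aci myacicontext

# Switch to the ACI context and run a container in the cloud.
docker context use myacicontext
docker run -p 80:80 nginx

# Switch back to the local desktop engine.
docker context use default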

These and other integrations are designed to make it easier for common Azure and Docker users to work in the Microsoft cloud service without having to jump through a lot of extra hoops to do it.

It’s worth noting that these integrations are starting in beta, but the company promises they should be released sometime in the second half of this year.
