Apr 30, 2021

Analytics as a service: Why more enterprises should consider outsourcing

With an increasing number of enterprise systems, growing teams, an expanding web presence and multiple digital initiatives, companies of all sizes are creating vast amounts of data every day. This data holds excellent business insights and immense opportunities, but its sheer volume has made it impossible for companies to derive actionable insights from it consistently.

According to Verified Market Research, the analytics-as-a-service (AaaS) market is expected to grow to $101.29 billion by 2026. Organizations that have not started on their analytics journey or are spending scarce data engineer resources to resolve issues with analytics implementations are not identifying actionable data insights. Through AaaS, managed services providers (MSPs) can help organizations get started on their analytics journey immediately without extravagant capital investment.

MSPs can take ownership of the company’s immediate data analytics needs, resolve ongoing challenges and integrate new data sources to manage dashboard visualizations, reporting and predictive modeling — enabling companies to make data-driven decisions every day.

AaaS often comes bundled with multiple business-intelligence-related services. Primarily, the service includes (1) services for data warehouses; (2) services for visualizations and reports; and (3) services for predictive analytics, artificial intelligence (AI) and machine learning (ML). When an organization partners with an MSP for analytics as a service, it can tap into business intelligence easily, instantly and at a lower cost of ownership than doing it in-house. This empowers the enterprise to focus on delivering better customer experiences, make unencumbered decisions and build data-driven strategies.

In today’s world, where customers value experiences over transactions, AaaS helps businesses dig deeper into their psyche and tap insights to build long-term winning strategies. It also enables enterprises to forecast and predict business trends by looking at their data and allows employees at every level to make informed decisions.

Apr 30, 2021

The health data transparency movement is birthing a new generation of startups

In the early 2000s, Jeff Bezos gave a seminal TED Talk titled “The Electricity Metaphor for the Web’s Future.” In it, he argued that the internet would enable innovation on the same scale that electricity did.

We are at a similar inflection point in healthcare, with the recent movement toward data transparency birthing a new generation of innovation and startups.

Those who follow the space closely may have noticed that there are twin struggles taking place: a push for more transparency on provider and payer data, including anonymous patient data, and another for strict privacy protection for personal patient data. What’s the main difference?

This sector is still somewhat nascent — we are in the first wave of innovation, with much more to come.

Anonymized data is much more freely available, while personal data is being locked down ever tighter (as it should be) due to regulations like GDPR, CCPA and their equivalents around the world.

The former trend is enabling a host of new vendors and services that will ultimately make healthcare better and more transparent for all of us.

These new companies could not have existed five years ago. The Affordable Care Act was the first step toward making anonymized data more available. It required healthcare institutions (such as hospitals and healthcare systems) to publish data on costs and outcomes. This included the release of detailed data on providers.

Later legislation required biotech and pharma companies to disclose monies paid to research partners. And every physician in the U.S. is now required to have a National Provider Identifier (NPI), which feeds a comprehensive public database of providers.

All of this allowed the creation of new types of companies that give both patients and providers more control over their data. Here are some key examples of how.

Allowing patients to access all their own health data in one place

This is a key capability of patients’ newfound access to health data. Think of how often, as a patient, you find that your providers aren’t aware of a treatment or test you’ve had elsewhere. Often you end up repeating a test simply because a provider has no record of it.

Apr 29, 2021

Healthcare is the next wave of data liberation

Why can we see all our bank, credit card and brokerage data on our phones instantaneously in one app, yet walk into a doctor’s office blind to our healthcare records, diagnoses and prescriptions? Our health status should be as accessible as our checking account balance.

The liberation of financial data enabled by startups like Plaid is beginning to happen with healthcare data, which will have an even more profound impact on society; it will save and extend lives. This accessibility is quickly approaching.

As early investors in Quovo and PatientPing, two pioneering companies in financial and healthcare data, respectively, it’s evident to us the winners of the healthcare data transformation will look different than they did with financial data, even as we head toward a similar end state.

For over a decade, government agencies and consumers have pushed for this liberation.

In 2009, the Health Information Technology for Economic and Clinical Health (HITECH) Act gave the first big industry push, catalyzing a wave of digitization through electronic health records (EHRs). Today, over 98% of medical records are digitized. This market is dominated by multibillion-dollar vendors like Epic, Cerner and Allscripts, which control 70% of patient records. However, these giant vendors have yet to make these records easily accessible.

A second wave of regulation has begun to address the problem of trapped data to make EHRs more interoperable and valuable. Agencies within the Department of Health and Human Services have mandated data sharing among payers and providers using a common standard, the Fast Healthcare Interoperability Resources (FHIR) protocol.
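
Part of what makes FHIR tractable for developers is that its resources are plain JSON documents served over REST. As a minimal sketch (the sample record below is illustrative, not drawn from any real system), pulling a patient's display name out of a FHIR `Patient` resource looks like this:

```python
# A FHIR Patient resource is ordinary JSON; this sample record is
# illustrative only, not data from any real server or patient.
sample_patient = {
    "resourceType": "Patient",
    "id": "example",
    "name": [{"family": "Chalmers", "given": ["Peter", "James"]}],
    "birthDate": "1974-12-25",
}

def display_name(patient: dict) -> str:
    """Render the first HumanName entry of a Patient resource as 'Given Family'."""
    name = patient["name"][0]
    return " ".join(name.get("given", []) + [name.get("family", "")])

print(display_name(sample_patient))  # Peter James Chalmers
```

In a real integration the same JSON would come back from a `GET {base}/Patient/{id}` request against a FHIR server, with the patient's consent and authorization handled first.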

Image Credits: F-Prime Capital

This push for greater data liquidity coincides with demand from consumers for better information about cost and quality. Employers have been steadily shifting a greater share of healthcare expenses to consumers through high-deductible health plans — from 30% in 2012 to 51% in 2018. As consumers pay for more of the costs, they care more about the value of different health options, yet are unable to make those decisions without real-time access to cost and clinical data.

Image Credits: F-Prime Capital

Tech startups have an opportunity to ease the transmission of healthcare data and address the push of regulation and consumer demands. The lessons from fintech make it tempting to assume that a Plaid for healthcare data would be enough to address all of the challenges within healthcare, but it is not the right model. Plaid’s aggregator model benefited from a relatively high concentration of banks, a limited number of data types and low barriers to data access.

By contrast, healthcare data is scattered across tens of thousands of healthcare providers, stored in multiple data formats and systems per provider, and is rarely accessed by patients directly. Many people log into their bank apps frequently, but few log into their healthcare provider portals, if they even know one exists.

HIPAA regulations and strict patient consent requirements also meaningfully increase the friction of data access and sharing. Financial data mostly serves one-to-one use cases, while healthcare data is a many-to-many problem: a single patient’s data is spread across many doctors and facilities and is needed by just as many for care coordination.

Because of this landscape, winning healthcare technology companies will need to build around four propositions:

Apr 21, 2021

4 ways martech will shift in 2021

The tidal wave of growth is upon us — an unprecedented economic boom that will manifest later this year, bringing significant investments, acquisitions and customer growth. But most tech companies and startups are not adequately prepared to capitalize on the opportunity that lies ahead.

Here’s how marketing in tech will shift — and what you need to know to reach more customers and accelerate growth in 2021.

First and foremost, differentiation is going to be imperative. It’s already hard enough to stand out and get noticed, and it’s about to get much more difficult as new companies emerge and investments and budgets balloon in the latter half of the year. Virtually all major companies are increasing budgets to pre-pandemic levels, but will delay those investments until the second half of the year. This will result in an increased intensity of competition that will drown out any undifferentiated players.

The second half of 2021 will bring incredible growth, the likes of which we haven’t seen in a long time.

Additionally, tech companies need to be mindful not to ignore the most important part of the ecosystem: people. Technology will only take you so far, and it’s not going to be enough to survive the competition. Marketing is about people, including your customers, team, partners, investors and the broader community.

Understanding who your people are and how you can use their help to build a strong foundation and drive exponential growth is essential.

Tactically, the most successful tech companies will embrace video and experimentation in their marketing — two components that will catapult them ahead of the competition.

Ignoring these predictions, backed by empirical evidence, will be detrimental and devastating. Fasten your seatbelts: 2021 is going to be a turbocharged year of growth opportunities for marketing in tech.

Differentiation is crucial

The explosion of tech companies and startups seeking to be the next big thing isn’t over yet. However, many of them are indistinguishable from each other and lack a compelling value proposition. Just one look at the websites of new and existing tech companies will reveal a proliferation of buzzwords and conceptual illustrations, leaving them all looking and sounding alike.

The tech companies that succeed are those that embrace one of the fundamentals of effective marketing — positioning.

In the ’80s, Al Ries and Jack Trout published “Positioning: The Battle For Your Mind” and coined the term, which documented the best-known approach to standing out in a noisy marketplace. As the market heats up, companies will realize the need to sharpen their positioning and dial in their focus to break through the noise.

To get attention and build traction, companies need to establish a position they can own. The mashup method — “Netflix but for coding lessons” — is not real positioning; it’s simply a lazy gimmick.

It is imperative to identify who your ideal customer is and not just who could use your product. Focusing on a segment of the market rather than the whole is, perhaps counterintuitively, the most effective approach to capturing the larger market.

Apr 16, 2021

Data scientists: Bring the narrative to the forefront

By 2025, 463 exabytes of data will be created each day, according to some estimates. (For perspective, one exabyte of storage could hold 50,000 years of DVD-quality video.) It’s now easier than ever to translate physical and digital actions into data, and businesses of all types have raced to amass as much data as possible in order to gain a competitive edge.
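
That equivalence is easy to sanity-check with back-of-the-envelope arithmetic, assuming roughly 2.3 GB per hour of DVD-quality video:

```python
EXABYTE = 10**18            # bytes in one exabyte (decimal definition)
GB_PER_HOUR = 2.3e9         # assumed bytes per hour of DVD-quality video
HOURS_PER_YEAR = 24 * 365

# How many years of continuous video fit in one exabyte?
years_of_video = EXABYTE / GB_PER_HOUR / HOURS_PER_YEAR
print(f"{years_of_video:,.0f} years")  # roughly 50,000 years
```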

However, in our collective infatuation with data (and obtaining more of it), what’s often overlooked is the role that storytelling plays in extracting real value from data.

The reality is that data by itself is insufficient to really influence human behavior. Whether the goal is to improve a business’ bottom line or convince people to stay home amid a pandemic, it’s the narrative that compels action, rather than the numbers alone. As more data is collected and analyzed, communication and storytelling will become even more integral in the data science discipline because of their role in separating the signal from the noise.

Data alone doesn’t spur innovation — rather, it’s data-driven storytelling that helps uncover hidden trends, powers personalization, and streamlines processes.

Yet this can be an area where data scientists struggle. In Anaconda’s 2020 State of Data Science survey of more than 2,300 data scientists, nearly a quarter of respondents said that their data science or machine learning (ML) teams lacked communication skills. This may be one reason why roughly 40% of respondents said they were able to effectively demonstrate business impact “only sometimes” or “almost never.”

The best data practitioners must be as skilled in storytelling as they are in coding and deploying models — and yes, this extends beyond creating visualizations to accompany reports. Here are some recommendations for how data scientists can situate their results within larger contextual narratives.

Make the abstract more tangible

Ever-growing datasets help machine learning models better understand the scope of a problem space, but more data does not necessarily help with human comprehension. Even for the most left-brain of thinkers, it’s not in our nature to understand large abstract numbers or things like marginal improvements in accuracy. This is why it’s important to include points of reference in your storytelling that make data tangible.

For example, throughout the pandemic, we’ve been bombarded with countless statistics around case counts, death rates, positivity rates, and more. While all of this data is important, tools like interactive maps and conversations around reproduction numbers are more effective than massive data dumps in terms of providing context, conveying risk, and, consequently, helping change behaviors as needed. In working with numbers, data practitioners have a responsibility to provide the necessary structure so that the data can be understood by the intended audience.

Apr 16, 2021

Enterprise security attackers are one password away from your worst day

If the definition of insanity is doing the same thing over and over and expecting a different outcome, then one might say the cybersecurity industry is insane.

Criminals continue to innovate with highly sophisticated attack methods, but many security organizations still use the same technological approaches they did 10 years ago. The world has changed, but cybersecurity hasn’t kept pace.

Distributed systems, with people and data everywhere, mean the perimeter has disappeared. And the hackers couldn’t be more excited. The same technology approaches, like correlation rules, manual processes and reviewing alerts in isolation, do little more than remedy symptoms while hardly addressing the underlying problem.

The current risks aren’t just technology problems; they’re also problems of people and processes.

Credentials are supposed to be the front gates of the castle, but as the security operations center (SOC) fails to change, it fails to detect. The cybersecurity industry must rethink its strategy to analyze how credentials are used and stop breaches before they become bigger problems.

It’s all about the credentials

Compromised credentials have long been a primary attack vector, but the problem has only grown worse in the mid-pandemic world. The acceleration of remote work has expanded the attack surface as organizations struggle to secure their networks while employees work from unsecured connections. In April 2020, the FBI said that cybersecurity attacks reported to the bureau grew by 400% compared to before the pandemic. Just imagine where that number is now in early 2021.

It only takes one compromised account for an attacker to enter Active Directory and create their own credentials. In such an environment, all user accounts should be treated as potentially compromised.

Nearly all of the hundreds of breach reports I’ve read have involved compromised credentials. More than 80% of hacking breaches are now enabled by brute force or the use of lost or stolen credentials, according to the 2020 Data Breach Investigations Report. The most effective and commonly used strategy is the credential stuffing attack, where digital adversaries break in, exploit the environment, then move laterally to gain higher-level access.
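
The kind of credential analysis argued for here can start very simply. As a hedged sketch (the event format and threshold below are illustrative, not from any particular product), flagging accounts that accumulate failed logins from many distinct IP addresses catches a common stuffing signature:

```python
from collections import defaultdict

def flag_credential_stuffing(events, ip_threshold=5):
    """Flag accounts with failed logins from many distinct source IPs,
    a common signature of credential-stuffing attacks.

    events: iterable of (username, source_ip, success) tuples.
    Returns the set of usernames that crossed the threshold."""
    failed_ips = defaultdict(set)
    for user, ip, success in events:
        if not success:
            failed_ips[user].add(ip)
    return {u for u, ips in failed_ips.items() if len(ips) >= ip_threshold}

# Illustrative log: "alice" is hammered from 8 IPs, "bob" fails once.
events = [("alice", f"10.0.0.{i}", False) for i in range(8)]
events.append(("bob", "10.0.1.1", False))
print(flag_credential_stuffing(events))  # {'alice'}
```

A production system would of course correlate far more signal (geography, timing, device fingerprints), but the point stands: looking at how credentials are used, rather than at alerts in isolation, surfaces the attack early.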

Mar 16, 2021

A crypto company’s journey to Data 3.0

Data is a gold mine for a company.

If managed well, it provides the clarity and insights that lead to better decision-making at scale, in addition to being an important tool for holding everyone accountable.

However, most companies are stuck in Data 1.0, which means they are leveraging data as a manual and reactive service. Some have started moving to Data 2.0, which employs simple automation to improve team productivity. The complexity of crypto data has opened up new opportunities in data, namely to move to the new frontier of Data 3.0, where you can scale value creation through systematic intelligence and automation. This is our journey to Data 3.0.

Coinbase is neither a finance company nor a tech company — it’s a crypto company. This distinction has big implications for how we work with data. As a crypto company, we work with three major types of data (instead of the usual one or two types of data), each of which is complex and varied:

  1. Blockchain: decentralized and publicly available.
  2. Product: large and real-time.
  3. Financial: high-precision and subject to many financial/legal/compliance regulations.

Image Credits: Michael Li/Coinbase

Our focus has been on how we can scale value creation by making this varied data work together, eliminating data silos, solving issues before they start and creating opportunities for Coinbase that wouldn’t exist otherwise.

Having worked at tech companies like LinkedIn and eBay, and also those in the finance sector, including Capital One, I’ve observed firsthand the evolution from Data 1.0 to Data 3.0. In Data 1.0, data is seen as a reactive function providing ad-hoc manual services or firefighting in urgent situations.

Feb 22, 2021

Winning enterprise sales teams know how to persuade the Chief Objection Officer

Many enterprise software startups at some point have faced the invisible wall. For months, your sales team has done everything right. They’ve met with a prospect several times, provided them with demos, free trials, documentation and references, and perhaps even signed a provisional contract.

The stars are all aligned and then, suddenly, the deal falls apart. Someone has put the kibosh on the entire project. Who is this deal-blocker and what can software companies do to identify, support and convince this person to move forward with a contract?

I call this person the Chief Objection Officer.

Most software companies spend a lot of time and effort identifying their potential buyers and champions within an organization. They build personas and do targeted marketing to these individuals and then fine-tune their products to meet their needs. These targets may be VPs of engineering, data leaders, CTOs, CISOs, CMOs or anyone else with decision-making authority. But what most software companies neglect to do during this exploratory phase is to identify the person who may block the entire deal.

This person is the anti-champion with the power to scuttle a potential partnership. Like your potential deal-makers, these deal-breakers can have any title with decision-making power. Chief Objection Officers aren’t simply potential buyers who end up deciding your product is not the right fit, but are instead blockers-in-chief who can make departmentwide or companywide decisions. Thus, it’s critical for software companies to identify the Chief Objection Officers that might block deals and, then, address their concerns.

So how do you identify the Chief Objection Officer? The trick is to figure out the main pain points that arise for companies when considering deploying your solution, and then walk backward to figure out which person these challenges impact the most. Here are some common pain points that your potential customers may face when considering your product.

Change is hard. Never underestimate the power of the status quo. Does implementing your product in one part of an organization, such as IT, force another department, such as HR, to change how they do their daily jobs?

Think about which leaders will be most reluctant to make changes; these Chief Objection Officers will likely not be your buyers, but instead the heads of departments most impacted by the implementation of your software. For example, a marketing team may love the ad targeting platform they use and thus a CMO will balk at new database software that would limit or change the way customer segment data is collected. Or field sales would object to new security infrastructure software that makes it harder for them to access the company network from their phones. The head of the department that will bear the brunt of change will often be a Chief Objection Officer.

Is someone’s job on the line?

Another common pain point when deploying a new software solution is that one or more jobs may become obsolete once it’s up and running. Perhaps your software streamlines and outsources most of a company’s accounts payable processes. Maybe your SaaS solution will replace an on-premise homegrown one that a team of developers has built and nurtured for years.

Feb 18, 2021

Why do SaaS companies with usage-based pricing grow faster?

Today we know of HubSpot — the maker of marketing, sales and service software products — as a preeminent public company with a market cap above $17 billion. But HubSpot wasn’t always on the IPO trajectory.

For its first five years in business, HubSpot offered three subscription packages ranging in price from $3,000 to $18,000 per year. The company struggled with high churn and anemic expansion revenue. Net revenue retention was near 70%, a far cry from the 100%+ that most SaaS companies aim to achieve.
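
Net revenue retention is computed from an existing customer cohort's recurring revenue over a period, excluding new-customer revenue. A minimal sketch with purely illustrative figures (not HubSpot's actual numbers) shows how a cohort lands near 70%:

```python
def net_revenue_retention(starting_arr, expansion, contraction, churned):
    """NRR for an existing-customer cohort over a period: starting recurring
    revenue plus expansion, minus downgrades and churned revenue, divided
    by starting revenue. Revenue from newly acquired customers is excluded."""
    return (starting_arr + expansion - contraction - churned) / starting_arr

# Illustrative figures only: heavy churn and weak expansion yield ~70% NRR.
nrr = net_revenue_retention(
    starting_arr=1_000_000,  # ARR of the cohort at period start
    expansion=50_000,        # upsells and upgrades
    contraction=80_000,      # downgrades
    churned=270_000,         # revenue from customers who left
)
print(f"{nrr:.0%}")  # 70%
```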

Something needed to change. So in 2011, the company introduced usage-based pricing: as customers used the software to generate more leads, they would proportionally increase their spend with HubSpot. This pricing change allowed HubSpot to share in the success of its customers.

In a usage-based model, expansion “just happens” as customers are successful.

By the time HubSpot went public in 2014, net revenue retention had jumped to nearly 100% — all without hurting the company’s ability to acquire new customers.

HubSpot isn’t an outlier. Public SaaS companies that have adopted usage-based pricing grow faster because they’re better at landing new customers, growing with them and keeping them as customers.

Image Credits: Kyle Poyar

Widen the top of the funnel

In a usage-based model, a company doesn’t get paid until after the customer has adopted the product. From the customer’s perspective, this means that there’s no risk to try before they buy. Products like Snowflake and Google Cloud Platform take this a step further and even offer $300+ in free usage credits for new developers to test drive their products.

Many of these free users won’t become profitable — and that’s okay. Like a VC firm, usage-based companies are making a portfolio of bets. Some of those will pay off spectacularly — and the company will directly share in that success.

Top-performing companies open up the top of the funnel by making it free to sign up for their products. They invest in a frictionless customer onboarding experience and high-quality support so that new users get hooked on the platform. As more new users become active, there’s a stronger foundation for future customer growth.

Jan 29, 2021

Subscription-based pricing is dead: Smart SaaS companies are shifting to usage-based models

Software buying has evolved. The days of executives choosing software for their employees based on IT compatibility or KPIs are gone. Employees now tell their boss what to buy. This is why we’re seeing more and more SaaS companies — Datadog, Twilio, AWS, Snowflake and Stripe, to name a few — find success with a usage-based pricing model.

The usage-based model allows a customer to start at a low cost, minimizing the friction of getting started while preserving the ability to monetize that customer over time, because the price is directly tied to the value the customer receives. Because the number of users who can access the software isn’t limited, customers are free to find new use cases — which leads to more long-term success and higher lifetime value.
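
Mechanically, a usage-based price list can be as simple as a free tier plus a per-unit rate, so spend tracks consumption rather than seat count. The free-tier size and rate below are illustrative, not any vendor's actual pricing:

```python
def monthly_bill(units_used, free_units=1_000, rate_per_unit=0.002):
    """Bill only for usage beyond a free tier; there is no per-seat charge.
    free_units and rate_per_unit are illustrative placeholder values."""
    return max(0, units_used - free_units) * rate_per_unit

print(monthly_bill(500))      # 0.0  -- still inside the free tier
print(monthly_bill(101_000))  # 200.0 -- spend scales with usage, not logins
```

The low (here, zero) starting price is what removes friction for new customers, while the linear term is what lets revenue expand automatically as usage grows.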

While we aren’t going 100% usage-based overnight, a look at some of the megatrends in software — automation, AI and APIs — shows that the value of a product normally doesn’t scale with more logins. Usage-based pricing will be the key to successful monetization in the future. Here are four top tips to help companies scale to $100+ million in ARR with this model.

1. Land-and-expand is real

Usage-based pricing is in all layers of the tech stack. Though it was pioneered in the infrastructure layer (think: AWS and Azure), it’s becoming increasingly popular for API-based products and application software — across infrastructure, middleware and applications.

Image Credits: Kyle Poyar / OpenView

Some fear that investors will hate usage-based pricing because customers aren’t locked into a subscription. But investors actually see it as a sign that customers are getting value from a product and that there’s no shelfware.

In fact, investors are increasingly rewarding usage-based companies in the market. Usage-based companies are trading at a 50% revenue multiple premium over their peers.

Investors especially love how the usage-based pricing model pairs with the land-and-expand business model. Of the software IPOs over the last three years, seven of the nine with the best net dollar retention have a usage-based model. Snowflake in particular is off the charts, with 158% net dollar retention.
