Sep 19, 2018

Einstein Voice gives Salesforce users gift of gab

Salespeople usually spend their days talking. They are on the phone and in meetings, but when it comes to updating Salesforce, they are back at the keyboard again typing notes and milestones, or searching for metrics about their performance. Today, Salesforce decided to change that by introducing Einstein Voice, a bit of AI magic that allows salespeople to talk to the program instead of typing.

In a world where Amazon Alexa and Siri make talking to our devices more commonplace in our non-work lives, it makes sense that companies are trying to bring that same kind of interaction to work.

In this case, you can conversationally enter information about a meeting, get daily briefings about key information on your day’s meetings (particularly nice for salespeople who spend their day in the car) and interact with Salesforce data dashboards by asking questions instead of typing queries.

All of these tools are designed to make life easier for busy salespeople. Most hate doing the administrative part of their jobs because time spent entering information, even if having a record benefits them in the long run, is time not spent doing their primary job, which is selling stuff.

For the meeting notes part, instead of typing on a smartphone, which can be a challenge anyway, you simply touch Meeting Debrief in the Einstein Voice mobile tool and start talking to enter your notes. The tool interprets what you’re saying. As with most transcription services, this is probably not perfect and will require some correcting, but it should get you most of the way there.

It can also pick out key data like dates and deal amounts and let you set action items to follow up on.

Gif: Salesforce

Brent Leary, founder and principal analyst at CRM Essentials, says this is a natural progression for Salesforce as people get more comfortable using voice interfaces. “I think this will make voice-first devices and assistants as important pieces to the CRM puzzle from both a customer experience and an employee productivity perspective,” he told TechCrunch.

It’s worth pointing out that Tact.AI has been giving Salesforce users this kind of voice service for some time, and Tact CEO Chuck Ganapathi doesn’t seem too concerned about Salesforce jumping in.

“Conversational AI is the future of enterprise software and it’s not a question of if or when. It’s all about the how, and we strongly believe that a Switzerland strategy is the only way to deliver on its promise. It’s no wonder we are the only company to be backed by Microsoft, Amazon and Salesforce,” he said.

Leary thinks there’s plenty of room for everyone, and that Salesforce getting involved will accelerate adoption for all players. “The Salesforce tide will lift all boats, and companies like Tact will see their profile increased significantly because while Salesforce is the leader in the category, its share of the market is still less than 20% of the market.”

Einstein is Salesforce’s catch-all brand for its artificial intelligence layer. In this case it’s using natural language processing, voice recognition technology and other artificial intelligence pieces to interpret the person’s voice and transcribe what they are saying or understand their request better.

Typically, Salesforce starts with a small set of functionality and then builds on that over time. That’s very likely what they are doing here, coming out with a product announcement in time for Dreamforce, their massive customer conference next week.

Sep 19, 2018

Fresh out of Y Combinator, Leena AI scores $2M seed round

Leena AI, a recent Y Combinator graduate focusing on HR chatbots to help employees answer questions like how much vacation time they have left, announced a $2 million seed round today from a variety of investors including Elad Gil and Snapdeal co-founders Kunal Bahl and Rohit Bansal.

Company co-founder and CEO Adit Jain says the seed money is about scaling the company and gaining customers. They hope to have 50 enterprise customers within the next 12-18 months. They currently have 16.

We wrote about the company in June when it was part of the Y Combinator Summer 2018 class. At the time Jain explained that they began in 2015 in India as a company called Chatteron. The original idea was to help others build chatbots, but like many startups, they realized there was a need not being addressed, in this case around HR, and they started Leena AI last year to focus specifically on that.

As they delved deeper into the HR problem, they found most employees had trouble getting answers to basic questions like how much vacation time they had or how to get a new baby on their health insurance. That forced a call to a help desk, even when the information was available online, because it was not always easy to find.

Jain pointed out that most HR policies are defined in policy documents, but employees don’t always know where they are. They felt a chatbot would be a good way to solve this problem and save a lot of time searching or calling for answers that should be easily found. What’s more, they learned that the vast majority of questions are fairly common and therefore easier for a system to learn.

Employees can access the Leena chatbot in Slack, Workplace by Facebook, Outlook, Skype for Business, Microsoft Teams and Cisco Spark. They also offer Web and mobile access to their service independent of these other tools.

Photo: Leena AI

What’s more, since most companies use a common set of backend HR systems like those from Oracle, SAP and NetSuite (also owned by Oracle), they have been able to build a set of standard integrators that are available out of the box with their solution.

The customer provides Leena with a handbook or a set of policy documents and they put their machine learning to work on that. Jain says, armed with this information, they can convert these documents into a structured set of questions and answers and feed that to the chatbot. They apply Natural Language Processing (NLP) to understand the question being asked and provide the correct answer.
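Leena hasn’t published its pipeline, but the flow Jain describes — turn policy text into question/answer pairs, then match incoming questions against them with NLP — can be mimicked at toy scale with a bag-of-words matcher. The FAQ entries below are hypothetical, and a real system would use learned language models rather than raw token overlap:

```python
import math
from collections import Counter

def tokenize(text):
    # Lowercase and split on non-alphanumeric characters.
    return [t for t in "".join(c if c.isalnum() else " " for c in text.lower()).split() if t]

def cosine(a, b):
    # Cosine similarity between two bags of words.
    dot = sum(a[t] * b[t] for t in a if t in b)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

# Q&A pairs as if extracted from a (hypothetical) policy handbook.
faq = {
    "How many vacation days do I get per year?": "Full-time employees accrue 20 vacation days annually.",
    "How do I add a new baby to my health insurance?": "Submit a dependent-change form within 30 days of birth.",
}

def answer(question):
    # Return the answer whose question is most similar to the input.
    q_vec = Counter(tokenize(question))
    best = max(faq, key=lambda k: cosine(q_vec, Counter(tokenize(k))))
    return faq[best]

print(answer("how much vacation time do I have?"))
# → Full-time employees accrue 20 vacation days annually.
```

Even this crude version shows why common, repetitive questions are the easy case: the vocabulary of the stored questions covers most of what employees actually ask.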

They see room to move beyond HR and expand into other departments such as IT, finance and vendor procurement that could also take advantage of bots to answer a set of common questions. For now, as a recent YC graduate, they have their first bit of significant funding and they will concentrate on building HR chatbots and see where that takes them.

Sep 19, 2018

IBM launches cloud tool to detect AI bias and explain automated decisions

IBM has launched a software service that scans AI systems as they work in order to detect bias and provide explanations for the automated decisions being made — a degree of transparency that may be necessary for compliance purposes, not just for a company’s own due diligence.

The new trust and transparency system runs on the IBM cloud and works with models built from what IBM bills as a wide variety of popular machine learning frameworks and AI-build environments — including its own Watson tech, as well as TensorFlow, SparkML, AWS SageMaker and AzureML.

It says the service can be customized to specific organizational needs via programming to take account of the “unique decision factors of any business workflow”.

The fully automated SaaS explains decision-making and detects bias in AI models at runtime — so as decisions are being made — which means it’s capturing “potentially unfair outcomes as they occur”, as IBM puts it.

It will also automatically recommend data to add to the model to help mitigate any bias that has been detected.

Explanations of AI decisions include showing which factors weighted the decision in one direction vs another; the confidence in the recommendation; and the factors behind that confidence.
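IBM hasn’t detailed how its service computes these explanations, but the general shape — per-factor contributions plus a confidence figure — can be illustrated with a hypothetical linear model. The feature names, weights and confidence formula here are invented for illustration only:

```python
import math

# Hypothetical linear credit-approval model: each feature's contribution
# shows which factors pushed the score toward approve vs. deny.
weights = {"income": 0.8, "debt_ratio": -1.2, "years_employed": 0.5}
bias = -0.3

def explain(features):
    contributions = {k: weights[k] * v for k, v in features.items()}
    score = bias + sum(contributions.values())
    # Confidence grows with distance from the decision boundary (score 0).
    confidence = 1 / (1 + math.exp(-abs(score)))
    return {
        "decision": "approve" if score > 0 else "deny",
        "confidence": round(confidence, 3),
        # Factors sorted by how strongly they weighted the decision.
        "factors": dict(sorted(contributions.items(), key=lambda kv: -abs(kv[1]))),
    }

print(explain({"income": 1.0, "debt_ratio": 0.4, "years_employed": 0.6}))
```

For a linear model the contributions are exact; for the complex models IBM targets, production systems typically approximate them with techniques such as local surrogate models.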

IBM also says the software keeps records of the AI model’s accuracy, performance and fairness, along with the lineage of the AI systems — meaning they can be “easily traced and recalled for customer service, regulatory or compliance reasons”.

For one example on the compliance front, the EU’s GDPR privacy framework references automated decision making, and includes a right for people to be given detailed explanations of how algorithms work in certain scenarios — meaning businesses may need to be able to audit their AIs.

The IBM AI scanner tool provides a breakdown of automated decisions via visual dashboards — an approach it bills as reducing dependency on “specialized AI skills”.

However, it also intends for its own professional services staff to work with businesses using the new software service. So it will be selling AI, ‘a fix’ for AI’s imperfections, and experts to help smooth any wrinkles when enterprises are trying to fix their AIs… which suggests that while AI will indeed remove some jobs, automation will be busy creating other types of work.

Nor is IBM the first professional services firm to spot a business opportunity around AI bias. A few months ago Accenture outed a fairness tool for identifying and fixing unfair AIs.

So with a major push towards automation across multiple industries there also looks to be a pretty sizeable scramble to set up and sell services to patch any problems that arise as a result of increasing use of AI.

And, indeed, to encourage more businesses to feel confident about jumping in and automating more. (On that front IBM cites research it conducted which found that while 82% of enterprises are considering AI deployments, 60% fear liability issues and 63% lack the in-house talent to confidently manage the technology.)

In addition to launching its own (paid-for) AI auditing tool, IBM says its research division will be open sourcing an AI bias detection and mitigation toolkit — with the aim of encouraging “global collaboration around addressing bias in AI”.

“IBM led the industry in establishing trust and transparency principles for the development of new AI technologies. It’s time to translate principles into practice,” said David Kenny, SVP of cognitive solutions at IBM, commenting in a statement. “We are giving new transparency and control to the businesses who use AI and face the most potential risk from any flawed decision making.”

Sep 18, 2018

Microsoft launches new AI applications for customer service and sales

Like virtually every other major tech company, Microsoft is currently on a mission to bring machine learning to all of its applications. It’s no surprise then that it’s also bringing ‘AI’ to its highly profitable Dynamics 365 CRM products. A year ago, the company introduced its first Dynamics 365 AI solutions and today it’s expanding this portfolio with the launch of three new products: Dynamics 365 AI for Sales, Customer Service and Market Insights.

“Many people, when they talk about CRM, or ERP of old, they referred to them as systems of oppression, they captured data,” said Alysa Taylor, Microsoft corporate VP for business applications and industry. “But they didn’t provide any value back to the end user — and what that end user really needs is a system of empowerment, not oppression.”

It’s no secret that few people love their CRM systems (except for maybe a handful of Dreamforce attendees), but ‘system of oppression’ is far from the ideal choice of words here. Yet Taylor is right that early systems often kept data siloed. Unsurprisingly, Microsoft argues that Dynamics 365 does not do that, allowing it to now use all of this data to build machine learning-driven experiences for specific tasks.

Dynamics 365 AI for Sales, unsurprisingly, is meant to help sales teams get deeper insights into their prospects using sentiment analysis. That’s obviously among the most basic of machine learning applications these days, but AI for Sales also helps these salespeople understand what actions they should take next and which prospects to prioritize. It’ll also help managers coach their individual sellers on the actions they should take.

Similarly, the Customer Service app focuses on using natural language understanding to understand and predict customer service problems and leverage virtual agents to lower costs. Taylor used this part of the announcement to throw some shade at Microsoft’s competitor Salesforce. “Many, many vendors offer this, but they offer it in a way that is very cumbersome for organizations to adopt,” she said. “Again, it requires a large services engagement, Salesforce partners with IBM Watson to be able to deliver on this. We are now out of the box.”

Finally, Dynamics 365 AI for Market Insights does just what the name implies: it provides teams with data about social sentiment, but this, too, goes a bit deeper. “This allows organizations to harness the vast amounts of social sentiment, be able to analyze it, and then take action on how to use these insights to increase brand loyalty, as well as understand what newsworthy events will help provide different brand affinities across an organization,” Taylor said. So the next time you see a company try to gin up some news, maybe it did so based on recommendations from Office 365 AI for Market Insights.

Sep 18, 2018

Microsoft is putting HoloLens to work with new Dynamics 365 applications

Microsoft HoloLens mixed reality glasses have always been interesting technology, but it’s never been clear how the company would move from novelty device to actual viable business use cases. Today, it made a move toward the latter, announcing a couple of applications designed to put the HoloLens to work in Dynamics 365, giving it a real business purpose.

Dynamics 365 is Microsoft’s one-stop shop for CRM and ERP, where a company can work on some of its key business software functions, including field service, in an integrated fashion. The company has been looking for HoloLens to bring computing power to a group of field workers, like repair technicians, for whom even a tablet would be awkward because they have to work with both hands free.

For these people, having a fully functioning Windows 10 computer you can wear on your face could be a big advantage and that’s what Microsoft is hoping to provide with HoloLens. The problem was finding use cases where this would make sense. One idea is providing remote assistance for people out in the field to get help from subject experts back at the office, and today the company announced Dynamics 365 Remote Assist.

In this scenario, the worker is wearing a HoloLens either to understand the repair scenario before they go to the site or to get remote help from a subject expert while they are at the site. The expert can virtually see what the technician is seeing through the HoloLens, and walk them through the repair without leaving the office, even circling parts and providing other annotations in real time.

Microsoft Remote Assist in action with expert walking the technician through the task. Photo: Microsoft

Microsoft is not the first company to create such a solution. ScopeAR announced RemoteAR, a similar product, four months ago, but Microsoft has the advantage of building its offering natively into Windows 10 and all that entails, including data integration to update the various repositories with information after the repair is complete.

The other business scenario the company is announcing today is called Dynamics 365 Layout. A designer can create a 3D representation of something like a store or factory layout in CAD software, view the design in 3D in HoloLens, and adjust it in real time before the design goes live. As Microsoft’s Lorraine Bardeen, who has the cool title of General Manager for Microsoft Mixed Reality, explains: instead of creating cardboard mockups and adjusting your 3D CAD drawing on your computer as you find issues in your design, you can put on your HoloLens, make adjustments in a virtual representation of the layout, and the CAD drawing updates for you as you make changes.

Laying out the pieces on a factory floor using Dynamics 365 Layout. Photo: Microsoft

Bardeen says the company has worked with customers to find real-world use cases that would save time, effort and money using mixed reality with HoloLens. They cite companies like Chevron, Ford and ThyssenKrupp Elevators as organizations actively embracing this kind of technology, but it’s still not clear if HoloLens and mixed reality will become a central component of business in the future. These two solutions GA on October 1st, and we will begin the process of finding out.

Sep 12, 2018

Nvidia launches the Tesla T4, its fastest data center inferencing platform yet

Nvidia today announced its new GPU for machine learning and inferencing in the data center. The new Tesla T4 GPUs (where the ‘T’ stands for Nvidia’s new Turing architecture) are the successors to the current batch of P4 GPUs that virtually every major cloud computing provider now offers. Google, Nvidia said, will be among the first to bring the new T4 GPUs to its Cloud Platform.

Nvidia argues that the T4s are significantly faster than the P4s. For language inferencing, for example, the T4 is 34 times faster than using a CPU and more than 3.5 times faster than the P4. Peak performance for the T4 is 260 TOPS for 4-bit integer operations and 65 TFLOPS for floating-point operations. The T4 sits on a standard low-profile 75-watt PCI-e card.

What’s most important, though, is that Nvidia designed these chips specifically for AI inferencing. “What makes Tesla T4 such an efficient GPU for inferencing is the new Turing tensor core,” said Ian Buck, Nvidia’s VP and GM of its Tesla data center business. “[Nvidia CEO] Jensen [Huang] already talked about the Tensor core and what it can do for gaming and rendering and for AI, but for inferencing — that’s what it’s designed for.” In total, the chip features 320 Turing Tensor cores and 2,560 CUDA cores.

In addition to the new chip, Nvidia is also launching a refresh of its TensorRT software for optimizing deep learning models. This new version also includes the TensorRT inference server, a fully containerized microservice for data center inferencing that plugs seamlessly into an existing Kubernetes infrastructure.


Sep 10, 2018

Adobe looks to AI to lift customer experience business

For years, marketers have been trying to optimize the online shopping experience to better understand their customers and deliver more customized interactions that ultimately drive more sales. Artificial intelligence was supposed to accelerate that, and today Adobe announced enhancements to Adobe Target and Adobe Experience Manager that attempt to deliver at least partly on that promise.

Adobe has been trying to lift the enterprise side of its business for some time, and even though they are well on their way to becoming a $10 billion company, the potential for even more revenue from the enterprise side of the business remains tantalizing. They are counting on AI to help push that along.

Adobe’s Loni Stark says companies are looking for more sophisticated solutions around customization and optimization. Part of that involves using Adobe’s intelligence layer, which they call Sensei, to help marketers as they tweak these programs to drive better experiences.

For starters, the company wants to help users choose the best algorithms for any given set of tasks. Adobe is bringing AI in to assist with a tool it released last year called Auto-Target. “One of the challenges marketers face has been which algorithms do you use, and how do you map them to your personalization strategy. We are enabling Adobe Sensei to choose the best algorithm for them.” She says giving them a smart assistant to help choose should make this task much less daunting for marketers.

Adobe is also bringing some smarts to layout design with a new tool called Smart Layouts, first introduced in March at Adobe Summit. The idea here is to deliver the right layout at any given time to allow marketing teams to scale personalization and increase the likelihood of action, which in marketing speak means buying something.

Once again the company is letting AI guide the process to generate different layouts automatically for different segments, depending on visitor behavior at any given moment. That means a retailer should be able to deliver ever more granular pages based on what it knows about visitors as they move through the shopping process. The more customized the experience, the more likely the shopper turns into a buyer.

Adobe is also looking at new delivery channels, particularly voice, as devices like the Amazon Alexa become increasingly popular. As with the web, mobile, print and other delivery approaches, marketers need to be able to perform basic tasks like A/B testing on different voices or workflows, and the company is building these capabilities into its tools.
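Adobe hasn’t described its implementation, but A/B testing a voice channel needs the same plumbing as on the web: deterministic assignment of each user to a variant so they don’t hear a different voice on every call. A common sketch (the experiment and variant names below are placeholders):

```python
import hashlib

def ab_bucket(user_id, experiment, variants=("voice_a", "voice_b")):
    # Hash user and experiment together so each user consistently gets
    # the same variant within an experiment, without storing any state.
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

print(ab_bucket("user-42", "voice-greeting-test"))
```

Hashing on both the user and the experiment name keeps assignments stable within one test while remaining independent across tests.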

All of these new features are part of Adobe’s ongoing attempt to streamline its marketing tools to make life easier for its customers. By using artificial intelligence to help guide the workflow, they hope to drive more revenue from the digital experience side of the house. While these tools should help, Adobe still makes the vast majority of its money from Creative Cloud. The Digital segment still lags at $586 million (up 18 percent YoY) out of total quarterly revenue of $2.20 billion in the most recent report in June.

The company spent a hefty $1.68 billion in May to snag Magento. It is due to report its next quarterly results on September 18th, and it will be interesting to see whether the Magento acquisition and the increasing use of artificial intelligence can help continue to grow this side of the business.

Sep 6, 2018

Salesforce updates Sales Cloud ahead of Dreamforce with increased automation

Dreamforce, Salesforce’s massive customer conference, is coming to San Francisco later this month, but the news is already starting well ahead of the event. Today, the company announced updates to its core Sales Cloud with an emphasis on automation and integration.

For starters, the company wants to simplify inside phone sales, not only giving the team a list of calls ranked by which prospects are most likely to convert, but also walking them through a sales process defined by management according to what it believes to be best practices.

High Velocity Sales is designed to take underlying intelligence from Salesforce Einstein and apply it to the sales process to give salespeople the best chance of converting a prospect. That includes defining contact cadence and content. For calls, the content could be as detailed as scripts spelling out what to say to the prospect. For emails, it could provide key details designed to move the prospect closer to a sale, along with how often to send the next email.

Defining sales cadence workflow in Sales Cloud. Photo: Salesforce

Once the sales team begins to move that sale toward a close, Salesforce CPQ (configure, price, quote) capabilities come into play. That product has its roots in the company’s SteelBrick acquisition several years ago, and it too gets a shiny new update for Dreamforce this year.

As a sale inches toward a win, the process typically moves to the proposal stage, where pricing and purchases are agreed upon, and, if all goes well, a contract gets signed. Updates to CPQ are designed to automate this to the extent possible, pulling information from notes and conversations into an automated quote, or relying on the salesperson when it gets more complex.

The idea, though, is to help sales automate the quote, and the creation of a bill once the quote has been accepted, to the extent possible, even providing a mechanism for automatic renewal when a subscription is involved.

The last piece involves Pardot Einstein, a sales and marketing tool designed to help find the best prospects coming through a company’s marketing process. It, too, is getting some help from the intelligence layer in a couple of ways.

Einstein Campaign Insights looks at the range of campaigns coming out of the marketing organization, determining which campaigns are performing and which aren’t, and pushing the art of campaign creation by using data science to help determine which types of activities are most likely to help convert a shopper into a buyer.

The other piece is called Einstein Behavior Score, which again uses the company’s underlying artificial intelligence tooling to analyze buying intent based on behavior. In other words, it tries to determine which people coming through your website and apps are most likely to actually buy, based on their behaviors: pages they visit, items they click and so forth.
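Salesforce hasn’t published how the score is computed, but the simplest form of intent scoring from behavior is a weighted sum of events compared against a threshold. The event types, weights and threshold below are made up purely for illustration:

```python
# Hypothetical event weights: pricing-page views and cart adds signal
# stronger buying intent than generic browsing.
EVENT_WEIGHTS = {"page_view": 1, "pricing_view": 5, "add_to_cart": 10, "doc_download": 3}

def behavior_score(events, threshold=15):
    # Sum the weights of a visitor's events; unknown events count zero.
    score = sum(EVENT_WEIGHTS.get(e, 0) for e in events)
    # Flag the visitor as a hot lead once the score crosses the threshold.
    return score, score >= threshold

score, hot = behavior_score(["page_view", "pricing_view", "add_to_cart", "page_view"])
print(score, hot)  # 17 True
```

A production system would learn the weights from historical conversion data rather than hand-tuning them, but the output is the same shape: a ranked list of prospects for sales to prioritize.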

Salesforce recognized the power of artificial intelligence to drive a more automated sales process early on, introducing Einstein in 2016. In typical Salesforce fashion, it has built upon that initial announcement and tried to use AI to automate and drive more successful sales.

The core CRM tool at the center of the Sales Cloud is simply a system of record of the customers inside any organization, but the company is trying to automate and integrate across its broad family of products whenever possible, making connections between products and services that might be difficult for humans to make on their own.

While it’s easy to get lost in AI marketing hype — and calling its AI layer “Einstein” certainly doesn’t help in that regard — the company is trying to take advantage of the technology to help customers drive more sales faster, which is the goal of any sales team. It will be up to Salesforce’s customers to decide how well it works.

Sep 5, 2018

Forethought looks to reshape enterprise search with AI

Forethought, a 2018 TechCrunch Disrupt Battlefield participant, has a modern vision for enterprise search that uses AI to surface the content that matters most in the context of work. Its first use case involves customer service, but it has a broader ambition to work across the enterprise.

The startup takes a bit of an unusual approach to search. Instead of the keyword-driven experience we are used to with Google, Forethought uses an information retrieval model with artificial intelligence underpinnings that it embeds directly into the workflow, company co-founder and CEO Deon Nicholas told TechCrunch. They have dubbed their answer engine “Agatha.”

Much like any search product, it begins by indexing relevant content. Nicholas says they built the search engine to be able to index millions of documents at scale very quickly. It then uses natural language processing (NLP) and natural language understanding (NLU) to read the documents as a human would.

“We don’t work on keywords. You can ask questions without keywords and using synonyms to help understand what you actually mean, we can actually pull out the correct answer [from the content] and deliver it to you,” he said.
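Forethought’s models are proprietary, but the effect Nicholas describes — matching a question to an answer through synonyms rather than shared keywords — can be mimicked at toy scale with a synonym-normalization step before matching. A real system would learn these equivalences as embeddings; the hand-written table and documents below are hypothetical:

```python
# Toy synonym table standing in for learned semantics.
SYNONYMS = {"pto": "vacation", "holiday": "vacation", "laptop": "computer"}

def normalize(text):
    # Lowercase, strip question marks, and map each token to its canonical form.
    tokens = text.lower().replace("?", "").split()
    return {SYNONYMS.get(t, t) for t in tokens}

docs = [
    "How to request vacation time",
    "Resetting your computer password",
]

def retrieve(query):
    # Return the document sharing the most normalized tokens with the query.
    q = normalize(query)
    return max(docs, key=lambda d: len(q & normalize(d)))

print(retrieve("requesting PTO"))  # matches the vacation doc via the synonym, not a keyword
```

The query and the matched document share no literal keyword; the hit comes entirely from the normalization step, which is the behavior Nicholas is claiming at much larger scale.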

One of the first use cases where they are seeing traction is customer support. “Our AI, Agatha for Support, integrates into a company’s help desk software, either Zendesk or Salesforce Service Cloud, and then we [read] tickets and suggest answers and relevant knowledge base articles to help close tickets more efficiently,” Nicholas explained. He claims their approach has increased agent efficiency by 20-30 percent.

The plan is to eventually expand beyond the initial customer service use case into other areas of the enterprise and follow a similar path of indexing documents and embedding the solution into the tools that people are using to do their jobs.

When they reach beta or general release, they will operate as a cloud service where customers sign up, enter their Zendesk or Salesforce credentials (or whatever other products happen to be supported at that point) and the product begins indexing the content.

The founding team, mostly in their mid-20s, have had a passion for artificial intelligence since high school. In fact, Nicholas built an AI program to read his notes and quiz him on history while still in high school. Later, at the University of Waterloo, he published a paper on machine learning and had internships at Palantir, Facebook and Dropbox. His first job out of school was at Pure Storage. All these positions had a common thread of working with data and AI.

The company launched last year and they debuted Agatha in private beta four months ago. They currently have six companies participating, the first of which has been converted to a paying customer.

They have closed a pre-seed round of funding too, and although they weren’t prepared to share the amount, the investment was led by K9 Ventures. Village Global, Original Capital and other unnamed investors also participated.

Aug 30, 2018

Amazon is quietly doubling down on cryptographic security

The growth of cloud services — with on-demand access to IT services over the Internet — has become one of the biggest evolutions in enterprise technology, but with it, so has the threat of security breaches and other cybercriminal activity. Now it appears that one of the leading companies in cloud services is looking for more ways to double down and fight the latter. Amazon’s AWS has been working on a range of new cryptographic and AI-based tools to help manage the security around cloud-based enterprise services, and it currently has over 130 vacancies for engineers with cryptography skills to help build and run it all.

One significant part of the work has been within a division of AWS called the Automated Reasoning Group, which focuses on identifying security issues and developing new tools to fix them for AWS and its customers. It does so using automated reasoning, a branch of artificial intelligence spanning computer science and mathematical logic that is aimed at helping computers reason completely, or nearly completely, automatically.

In recent times, Amazon has registered two new trademarks, Quivela and SideTrail, both of which have connections to ARG.

Classified in its trademark application as “computer software for cryptographic protocol specification and verification,” Quivela also has a Github repository within AWS Labs’ profile that describes it as a “prototype tool for proving the security of cryptographic protocols,” developed by the AWS Automated Reasoning Group. (The ARG also has as part of its mission to share code and ideas with the community.)

SideTrail is not on Github, but Byron Cook, an academic who is the founder and director of the AWS Automated Reasoning Group, has co-authored a research paper called “SideTrail: Verifying the Time Balancing of Cryptosystems.” However, the link to the paper, describing what this is about, is no longer working.

The trademark application for SideTrail includes a long list of potential applications (as trademark applications often do). The general idea is cryptography-based security services. Among them: “Computer software, namely, software for monitoring, identifying, tracking, logging, analyzing, verifying, and profiling the health and security of cryptosystems; network encryption software; computer network security software,” “Providing access to hosted operating systems and computer applications through the Internet,” and a smattering of consulting potential: “Consultation in the field of cloud computing; research and development in the field of security and encryption for cryptosystems; research and development in the field of software; research and development in the field of information technology; computer systems analysis.”

Added to this, in July a customer of AWS started testing out two other new cryptographic tools, also developed by the ARG to improve an organization’s cybersecurity, that were originally released the previous August (2017). Tiros and Zelkova, as the two tools are called, use math-based techniques to evaluate access control schemes and security configurations, and give feedback on different setups, to help troubleshoot and prove the effectiveness of security systems across storage (S3) buckets.

Amazon has not trademarked Tiros and Zelkova. A Zelkova trademark, for financial services, appears to be registered as an LLC called “Zelkova Acquisition” in Las Vegas, while there is no active trademark listed for Tiros.

Amazon declined to respond to our questions about the trademarks. A selection of people we contacted associated with the projects did not respond to requests for comment.

More generally, cryptography is a central part of how IT services are secured: Amazon’s Automated Reasoning Group has been around since 2014 working in this area. But Amazon appears to be doing more now both to ramp up the tools it produces and consider how it can be applied across the wider business. A quick look on open vacancies at the company shows that there are currently 132 openings at Amazon for people with cryptography skills.

“Cloud is the new computer, the Earth is the motherboard and data centers are the cards,” Cook said in a lecture he delivered recently describing AWS and the work that the ARG is doing to help AWS grow. “The challenge is that as [AWS] scales it needs to be ever more secure… How does AWS continue to scale quickly and securely?

“AWS has made a big bet on our community,” he continued, as one answer to that question. That’s led to an expansion of the group’s activities in areas like formal verification and beyond, as a way of working with customers and encouraging them to move more data to the cloud.

Amazon is also making some key acquisitions to build up its cloud security footprint, such as Sqrrl and Harvest.ai, two AI-based security startups whose founding teams both happen to have worked at the NSA.

Amazon’s AWS division pulled in over $6 billion in revenues last quarter with $1.6 billion in operating income, a healthy margin that underscores the shift that businesses and other organizations are making to cloud-based services.

Security is an essential component of how that business will continue to grow for Amazon and the wider industry: more trust in the infrastructure, and more proofs that cloud architectures can work better than using and scaling the legacy systems that businesses use today, will bolster the business. And it’s also essential, given the rise of breaches and ever more sophisticated cyber crimes. Gartner estimates that cloud-based security services will be a $6.9 billion market this year, rising to nearly $9 billion by 2020.

Automated tools that help human security specialists do their jobs better are an area that others, like Microsoft, are also eyeing. Last year, Microsoft acquired the Israeli security firm Hexadite, which offers remediation services to complement and bolster the work done by enterprise security specialists.
