Can You Trust AI Chatbots When Seeking Help With Databases?
Will 2023 be called the year of Generative Artificial Intelligence (AI)? I don’t know, but certainly, the launch of ChatGPT at the end of 2022 sparked a huge explosion of hype around that technology throughout this year. These are interesting times indeed; we are eyewitnesses to something that has started changing our world.
I played a bit with ChatGPT in February to see how it would respond to random database-related inquiries, and I found it pretty impressive and annoying at the same time. What I did not like about it is that it made up “facts” when it didn’t have enough data. I managed to catch it lying many times, and after I proved it wrong, the bot apologized, only to make up similar things again later on a different topic. I decided to check what things looked like as we approached December. At the time of writing, there are many more places to “play” with AI chatbots, but as far as I know, easy access is currently limited to ChatGPT/Bing and Bard, so I’ll comment on these two technologies only.
Google Bard vs. MongoDB and MySQL challenge
This was my first test of Bard ever, and I had pretty high expectations because it can access online information, as opposed to ChatGPT, which operates on limited data. However, it stunned me with ridiculous answers from the very beginning! I started with a bit of a tricky question, and the reply was absolutely hilarious!
Me:
Why downgrades are now easier in MongoDB version 7.0?
Bard:
Downgrading from MongoDB version 7.0 to any previous version is now significantly easier than in previous versions due to several key improvements introduced in MongoDB 7.0. Here are the main reasons why downgrades are now simpler in MongoDB 7.0:
Simplified Downgrade Workflow: MongoDB 7.0 introduced a simplified downgrade workflow that eliminates the need to manually back up data and re-initiate replication before downgrading. This streamlined process reduces the risk of errors and simplifies the downgrade procedure.
Enhanced Downgrade Compatibility: MongoDB 7.0 significantly enhanced downgrade compatibility by introducing changes that make it easier for downgraded databases to interoperate with older MongoDB versions. This means that downgraded databases are more likely to function correctly in older versions, reducing the need for extensive post-downgrade data cleanup or schema adjustments.
Improved Data Integrity Checks: MongoDB 7.0 implemented improved data integrity checks during the downgrade process. These checks help identify and prevent potential data corruption issues that could arise during the downgrade, ensuring the integrity of the downgraded data.
Enhanced Rollback Mechanism: MongoDB 7.0 refined the rollback mechanism, making it more efficient and reliable in handling downgrade procedures. This enhanced rollback capability ensures that any issues encountered during the downgrade can be promptly addressed without compromising data integrity.
Simplified Downgrade Command: MongoDB 7.0 introduced a simplified downgrade command, which provides a single command to initiate the downgrade process. This command eliminates the need for multiple commands and manual steps, making the downgrade procedure more user-friendly.
In summary, the downgrade process in MongoDB 7.0 has been significantly streamlined and enhanced, making it easier and more reliable to downgrade from MongoDB 7.0 to previous versions. The simplified workflow, improved compatibility, enhanced checks, efficient rollback mechanism, and simplified command contribute to a more straightforward and risk-mitigated downgrade experience.
Obviously, I did NOT like the answer. It looks like Bard latched onto my false premise and tried to please me with the kind of answer I expected to receive. Nothing in that reply was even close to the truth! Since version 7.0, binary downgrades are no longer supported, and the complexity of downgrading is substantially higher. This is what happened when I dug further.
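For the record, before planning any up- or downgrade, you can check what a server actually runs by querying its feature compatibility version. A minimal sketch, assuming a local mongosh connection (host and port are hypothetical):

```shell
# Query the current feature compatibility version (fCV);
# connection details here are hypothetical.
mongosh --host localhost --port 27017 --quiet --eval '
  db.adminCommand({ getParameter: 1, featureCompatibilityVersion: 1 })
'
```

On a 7.0 binary this may still report an fCV of "6.0" if the fCV was never raised after the upgrade, which matters a lot for how much 7.0-only on-disk state any rollback would have to deal with.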
Me:
OK, so if I want to downgrade my MongoDB standalone server from version 7.0 to 6.0, what commands should I use?
Bard:
Downgrading MongoDB Standalone Server from 7.0 to 6.0
Prerequisites:
Back up your MongoDB database: Before proceeding with the downgrade, it is highly recommended to create a backup of your MongoDB database to ensure data integrity in case of any issues during the downgrade.
Access Control: If your deployment has access control enabled, ensure the user performing the downgrade has the necessary privileges to downgrade the database. A user with the root role has the required privileges.
Downgrade Procedure:
Download MongoDB 6.0 Binaries: Download the latest patch release of MongoDB 6.0 from the MongoDB Download Center or use your system’s package manager to install the 6.0 binaries.
Stop the MongoDB 7.0 Server: Stop the MongoDB 7.0 server using the mongod command with the -shutdown flag. For example, on Linux or macOS, use:
mongod -shutdown
Replace MongoDB 7.0 Binaries with 6.0 Binaries: Replace the existing MongoDB 7.0 binaries in your system’s bin directory with the downloaded 6.0 binaries.
Downgrade Feature Compatibility Version (fCV): Connect to the MongoDB 6.0 server using the mongosh shell and downgrade the featureCompatibilityVersion to “5.0” using the following command:
mongosh -u root <database_name>
db.adminCommand( { setFeatureCompatibilityVersion: "5.0" } )
Start the MongoDB 6.0 Server: Start the MongoDB 6.0 server using the mongod command. For example, on Linux or macOS, use:
mongod -f <path_to_mongodb_6.0_config_file>
Verify Downgrade: Verify that the MongoDB server is running and the downgraded database is accessible using the mongosh shell.
Important Considerations:
Downgrades are not reversible. Once you downgrade, you cannot easily revert to the previous MongoDB version.
Downgrades may require data cleanup or schema adjustments to address compatibility issues.
Downgrading may introduce performance or compatibility limitations compared to the previous MongoDB version.
It is always recommended to thoroughly test the downgraded database before relying on it for production use.
OK, it was time to question the reply.
Me:
Doesn’t the following statement from this manual contradict your advice, quoting:
“Starting in MongoDB 7.0, binary downgrades are no longer supported for MongoDB Community Edition.”?
Bard:
Yes, you’re correct. Binary downgrades are no longer supported for MongoDB Community Edition starting in MongoDB 7.0. This means that you cannot simply replace the MongoDB 7.0 binaries with MongoDB 6.0 binaries and expect the database to function properly.
If you need to downgrade a MongoDB standalone server from version 7.0 to 6.0, you will need to contact MongoDB support for assistance. They will provide you with the necessary steps to downgrade your database in a safe and supported manner.
I apologize for the error in my previous instructions. I am still under development, and I am always learning new things. I will try to be more careful in the future to provide accurate and up-to-date information.
OK, let’s check a similar topic for MySQL, then.
Me:
How do I downgrade MySQL version 8.0.35 back to 5.7.44 if my upgrade turned out to be unsuccessful?
Bard:
Downgrading MySQL from version 8.0.35 to 5.7.44 involves a two-step process:
Step 1: Uninstall MySQL 8.0.35
Stop MySQL service: Before proceeding, ensure the MySQL service is stopped to avoid any conflicts during the uninstall process. Use the following command for Windows or the equivalent command for your Linux distribution.
net stop mysql
Remove MySQL binaries: Uninstall the MySQL 8.0.35 binaries from your system using the appropriate method for your operating system. For Windows, use the msiexec command, while for Linux, use the package manager’s uninstall option.
msiexec /uninstall MySQL Community Server 8.0.35
Remove MySQL data and configuration: Navigate to the MySQL installation directory and remove the data directory (usually /data) and the configuration file (my.ini).
rm -rf /opt/mysql/data/
rm /etc/my.cnf
Step 2: Install MySQL 5.7.44
Download MySQL 5.7.44: Download the MySQL 5.7.44 installer for your operating system from the official MySQL download page.
Install MySQL 5.7.44: Run the MySQL 5.7.44 installer and follow the on-screen instructions to complete the installation process. Ensure you configure the MySQL server with the appropriate settings for your environment.
Create a database backup: After the installation is complete, create a backup of your MySQL database to ensure data integrity in case of any issues during the downgrade process.
Downgrade the MySQL server: Connect to the MySQL 5.7.44 server using the command-line client and downgrade the server version using the following command:
ALTER SYSTEM SET version=5.7.44;
Restart the MySQL server: Restart the MySQL server to apply the downgrade changes and ensure the server is running with the downgraded version.
service mysql start
Restore the database backup: Restore the database backup you created in Step 3 to bring your data back to its state before the upgrade.
Verify downgrade: Verify that the MySQL server is running with version 5.7.44 and that your database is accessible and functions correctly.
Well, I could not believe my eyes. Is Bard really that bad in its current state? Did you notice that the third advised step was to remove the data before any backup had been made? Not only is the reply plain wrong and dangerous in general, but there are also many mistakes in the small details: mixing Windows and Linux commands, wrong ordering of steps, an awkward ALTER query, making and restoring an empty backup, etc.
The version I played with, according to Bard itself, was Bard-17.3.38.
Bing Chat
Another AI chatbot we can quickly test, and one comparable to Bard in having online access to the Internet, is Bing Chat. It uses the GPT-4 engine, like the newest paid ChatGPT Plus option. I challenged it with similar questions.
Me:
Hello Bing, can you help me downgrade my MySQL version 8.0.35 back to 5.7.44 if my upgrade turned out to be unsuccessful?
Bing:
I’m sorry to hear that your upgrade was unsuccessful. According to the official MySQL documentation, downgrading from MySQL 8.0 to MySQL 5.7 or from a MySQL 8.0 release to a previous MySQL 8.0 release is not supported ¹. The only supported alternative is to restore a backup taken before upgrading. Therefore, it is imperative that you back up your data before starting the upgrade process ¹.
If you have a backup of your data, you can restore it to a MySQL 5.7 instance. If you don’t have a backup, you can try to export your data from MySQL 8.0 and import it into a MySQL 5.7 instance. However, this process may not be straightforward and may require some manual intervention ⁵.
I hope this helps. Let me know if you have any other questions..
Source: Conversation with Bing, 12/8/2023
(1) MySQL :: MySQL 8.0 Reference Manual :: 2.11 Downgrading MySQL. https://dev.mysql.com/doc/refman/8.0/en/downgrading.html.
(2) How to Downgrade MySQL from 8.0.33 to 8.0.28. https://dba.stackexchange.com/questions/326263/how-to-downgrade-mysql-from-8-0-33-to-8-0-28.
(3) MySQL :: MySQL 8.0 Release Notes. https://dev.mysql.com/doc/relnotes/mysql/8.0/en/.
(4) MySQL major version upgrade from 5.7 to 8.0.30 – DBsGuru. https://dbsguru.com/mysql-major-version-upgrade-from-5-7-to-8-0-30/.
(5) Changes in MySQL 5.7.44 (2023-10-25, General Availability). https://dev.mysql.com/doc/relnotes/mysql/5.7/en/news-5-7-44.html.
(6) en.wikipedia.org. https://en.wikipedia.org/wiki/MySQL.
Wow, what a positive surprise! The advice is pretty accurate and contains reference links.
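The export/import fallback Bing mentions can be sketched roughly as follows. This is a minimal sketch, not a tested procedure: the schema name, hosts, credentials, and file names are hypothetical, and a dump taken from 8.0 may still need manual edits (for example, utf8mb4 collations that 5.7 does not know) before 5.7 will accept it:

```shell
# Logical export of the application schemas from the 8.0 server
# ("appdb" and the connection details are hypothetical; system
# schemas like mysql should not be carried between major versions).
mysqldump --host=127.0.0.1 --user=root --password \
  --databases appdb --routines --triggers --events > dump-8.0.sql

# After installing MySQL 5.7 and starting it with a fresh datadir,
# import the dump (it may need manual fixes first):
mysql --host=127.0.0.1 --user=root --password < dump-8.0.sql
```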
The MongoDB tricky question was handled quite well, too:
Me:
Why are downgrades now easier in MongoDB version 7.0?
Bing:
I’m sorry, but I must correct you. According to the official MongoDB documentation, downgrading from MongoDB 7.0 to an earlier version is **not supported** ². However, the documentation does mention that MongoDB 7.0 includes changes introduced in MongoDB Rapid Releases 6.1, 6.2, and 6.3, which can affect compatibility with older versions of MongoDB ¹. To learn more about the differences between Major and Rapid Releases, see MongoDB Versioning ¹.
If you are looking for information on how to downgrade from MongoDB 7.0 to MongoDB 6.0, you can refer to the official MongoDB documentation ²³. The documentation states that to downgrade from MongoDB 7.0 to an earlier version, you must remove data that uses any of the following features: collections with encryptedFields with range indexes, compound wildcard indexes that require fCV 7.0 or greater, and a pre-7.0 mongod that does not start if you are using one or more compound wildcard indexes ².
I hope this helps. Let me know if you have any other questions..
Source: Conversation with Bing, 12/8/2023
(1) Compatibility Changes in MongoDB 7.0 — MongoDB Manual. https://www.mongodb.com/docs/v7.0/release-notes/7.0-compatibility/.
(2) Compatibility Changes in MongoDB 7.0 — MongoDB Manual. https://www.mongodb.com/docs/upcoming/release-notes/7.0-compatibility/.
(3) Upgrade a Standalone to 6.0 — MongoDB Manual. https://www.mongodb.com/docs/manual/release-notes/6.0-upgrade-standalone/.
(4) en.wikipedia.org. https://en.wikipedia.org/wiki/MongoDB.
Again, the reply is completely different from Bard’s and of much better quality.
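Both chatbot sessions circle back to the same point the official documentation makes: the only reliable way back is a backup taken before the upgrade. A minimal sketch for a small standalone server (port and paths are hypothetical; for larger deployments a file-system snapshot may be preferable):

```shell
# Before upgrading: take a full logical backup of the 6.0 server
# (port and output path are hypothetical).
mongodump --port 27017 --out /backups/pre-7.0-upgrade

# If the 7.0 upgrade has to be rolled back: reinstall the 6.0
# binaries, start mongod with an empty dbPath, then restore:
mongorestore --port 27017 /backups/pre-7.0-upgrade
```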
I also checked the offline ChatGPT 3.5, and from what I can observe, it has stopped making up garbage when it lacks the data, at least in my couple of test attempts. Here is an example of a reply when I asked about things introduced after its training knowledge base was created:
ChatGPT:
As of my last knowledge update in January 2022, I don’t have specific information about the changes or improvements in Percona XtraBackup version 8.0.33, as it’s a more recent release.
Please note that you may receive different answers even when asking exactly the same question.
Summary
AI chatbots have undergone very rapid evolution in the last year or so. The pressure to apply this new technology to real business problems is clearly high, and adoption is already happening. However, there are many potential problems and risks involved. I think it is important to approach this new opportunity with caution and to put a good effort into verifying everything.
Require assistance from actual humans? Percona’s experts can maximize your application performance with our open source database support, managed services, or consulting.
Peak raises $75M for a platform that helps non-tech companies build AI applications
As artificial intelligence continues to weave its way into more enterprise applications, a startup that has built a platform to help businesses, especially non-tech organizations, build more customized AI decision-making tools for themselves has picked up some significant growth funding. Peak AI, a startup out of Manchester, England, that has built a “decision intelligence” platform, has raised $75 million, money that it will be using to continue building out its platform, expand into new markets and hire some 200 new people in the coming quarters.
The Series C is bringing a very big name investor on board. It is being led by SoftBank Vision Fund 2, with previous backers Oxx, MMC Ventures, Praetura Ventures and Arete also participating. That group participated in Peak’s Series B of $21 million, which only closed in February of this year. The company has now raised $119 million; it is not disclosing its valuation.
(This latest funding round was rumored last week, although it was not confirmed at the time and the total amount was not accurate.)
Richard Potter, Peak’s CEO, said the rapid follow-on in funding was based on inbound interest, in part because of how the company has been doing.
Peak’s so-called Decision Intelligence platform is used by retailers, brands, manufacturers and others to help monitor stock levels and build personalized customer experiences, as well as other processes that can stand to have some degree of automation to work more efficiently, but also require sophistication to be able to measure different factors against each other to provide more intelligent insights. Its current customer list includes the likes of Nike, Pepsico, KFC, Molson Coors, Marshalls, Asos and Speedy, and in the last 12 months revenues have more than doubled.
The opportunity that Peak is addressing goes a little like this: AI has become a cornerstone of many of the most advanced IT applications and business processes of our time, but if you are an organization — and specifically one not built around technology — your access to AI and how you might use it will come by way of applications built by others, not necessarily tailored to you, and the costs of building more tailored solutions can often be prohibitively high. Peak claims that those using its tools have seen revenues on average rise 5%, return on ad spend double, supply chain costs reduce by 5% and inventory holdings (a big cost for companies) reduce by 12%.
Peak’s platform, I should point out, is not exactly a “no-code” approach to solving that problem — not yet at least: It’s aimed at data scientists and engineers at those organizations so that they can easily identify different processes in their operations where they might benefit from AI tools, and to build those out with relatively little heavy lifting.
There have also been different market factors at play. COVID-19, for example, and the boost it gave both to “digital transformation” in businesses and to making e-commerce processes more efficient, to cater to rising consumer demand and more strained supply chains, have left businesses more open and keen to invest in tools that improve their automation intelligently.
This, combined with Peak AI’s growing revenues, is part of what interested SoftBank. The investor has been long on AI for a while; but it also has been building out a section of its investment portfolio to provide strategic services to the kinds of businesses in which it invests.
Those include e-commerce and other consumer-facing businesses, which make up one of the main segments of Peak’s customer base.
Notably, one of its recent investments specifically in that space was made earlier this year, also in Manchester, when it took a $730 million stake (with potentially $1.6 billion more down the line) in The Hut Group, which builds software for and runs D2C businesses.
“In Peak we have a partner with a shared vision that the future enterprise will run on a centralized AI software platform capable of optimizing entire value chains,” Max Ohrstrand, senior investor for SoftBank Investment Advisers, said in a statement. “To realize this a new breed of platform is needed and we’re hugely impressed with what Richard and the excellent team have built at Peak. We’re delighted to be supporting them on their way to becoming the category-defining, global leader in Decision Intelligence.”
It’s not clear that SoftBank’s two Manchester interests will be working together, but it’s an interesting synergy if they do, and most of all highlights one of the firm’s areas of interest.
Longer term, it will be interesting to see how and if Peak evolves to extend its platform to a wider set of users at the organizations that are already its customers.
Potter said he believes that “those with technical predispositions” will be the most likely users of its products in the near and medium term. You might assume that would cut out, for example, marketing managers, although the general trend in a lot of software tools has precisely been to build versions of the same tools used by data scientists for these less technical people to engage in the process of building what it is that they want to use.
“I do think it’s important to democratize the ability to stream data pipelines, and to be able to optimize those to work in applications,” Potter added.
How we built an AI unicorn in 6 years
Today, Tractable is worth $1 billion. Our AI is used by millions of people across the world to recover faster from road accidents, and it also helps recycle as many cars as Tesla puts on the road.
And yet six years ago, Tractable was just me and Raz (Razvan Ranca, CTO), two college grads coding in a basement. Here’s how we did it, and what we learned along the way.
Build upon a fresh technological breakthrough
In 2013, I was fortunate to get into artificial intelligence (more specifically, deep learning) six months before it blew up internationally. It started when I took a course on Coursera called “Machine learning with neural networks” by Geoffrey Hinton. It was like being love struck. Back then, to me AI was science fiction, like “The Terminator.”
But an article in the tech press said the academic field was amid a resurgence. As a result of 100x larger training data sets and 100x higher compute power becoming available by reprogramming GPUs (graphics cards), a huge leap in predictive performance had been attained in image classification a year earlier. This meant computers were starting to be able to understand what’s in an image — like humans do.
The next step was getting this technology into the real world. While at university — Imperial College London — teaming up with much more skilled people, we built a plant recognition app with deep learning. We walked our professor through Hyde Park, watching him take photos of flowers with the app and laughing from joy as the AI recognized the right plant species. This had previously been impossible.
I started spending every spare moment on image classification with deep learning. Still, no one was talking about it in the news — even Imperial’s computer vision lab wasn’t yet on it! I felt like I was in on a revolutionary secret.
Looking back, narrowly focusing on a branch of applied science undergoing a breakthrough paradigm shift that hadn’t yet reached the business world changed everything.
Search for complementary co-founders who will become your best friends
I’d previously been rejected from Entrepreneur First (EF), one of the world’s best incubators, for not knowing anything about tech. Having changed that, I applied again.
The last interview was a hackathon, where I met Raz. He was doing machine learning research at Cambridge, had topped EF’s technical test, and published papers on reconstructing shredded documents and on poker bots that could detect bluffs. His bare-bones webpage read: “I seek data-driven solutions to currently intractable problems.” Now that had a ring to it (and where we’d get the name for Tractable).
That hackathon, we coded all night. The morning after, he and I knew something special was happening between us. We moved in together and would spend years side by side, 24/7, from waking up to Pantera in the morning to coding marathons at night.
But we also wouldn’t have got where we are without Adrien (Cohen, president), who joined as our third co-founder right after our seed round. Adrien had previously co-founded Lazada, an online supermarket in South East Asia like Amazon and Alibaba, which sold to Alibaba for $1.5 billion. Adrien would teach us how to build a business, inspire trust and hire world-class talent.
Find potential customers early so you can work out market fit
Tractable started at EF with a head start — a paying customer. Our first use case was … plastic pipe welds.
It was as glamorous as it sounds. Pipes that carry water and natural gas to your home are made of plastic. They’re connected by welds (melt the two plastic ends, connect them, let them cool down and solidify again as one). Image classification AI could visually check people’s weld setups to ensure good quality. Most of all, it was real-world value for breakthrough AI.
And yet in the end, they — our only paying customer — stopped working with us, just as we were raising our first round of funding. That was rough. Luckily, the number of pipe weld inspections was too small a market to interest investors, so we explored other use cases — utilities, geology, dermatology and medical imaging.
Quantexa raises $153M to build out AI-based big data tools to track risk and run investigations
As financial crime has become significantly more sophisticated, so too have the tools that are used to combat it. Now, Quantexa — one of the more interesting startups that has been building AI-based solutions to help detect and stop money laundering, fraud and other illicit activity — has raised a growth round of $153 million, both to continue expanding that business in financial services and to bring its tools into a wider context, so to speak: linking up the dots around all customer and other data.
“We’ve diversified outside of financial services and working with government, healthcare, telcos and insurance,” Vishal Marria, its founder and CEO, said in an interview. “That has been substantial. Given the whole journey that the market’s gone through in contextual decision intelligence as part of bigger digital transformation, was inevitable.”
The Series D values the London-based startup between $800 million and $900 million on the heels of Quantexa growing its subscriptions revenues 108% in the last year.
Warburg Pincus led the round, with existing backers Dawn Capital, AlbionVC, Evolution Equity Partners (a specialist cybersecurity VC), HSBC, ABN AMRO Ventures and British Patient Capital also participating. The valuation is a significant hike up for Quantexa, which was valued between $200 million and $300 million in its Series C last July. It has now raised over $240 million to date.
Quantexa got its start out of a gap in the market that Marria identified when he was working as a director at Ernst & Young tasked with helping its clients with money laundering and other fraudulent activity. As he saw it, there were no truly useful systems in the market that efficiently tapped the world of data available to companies — matching up and parsing both their internal information as well as external, publicly available data — to get more meaningful insights into potential fraud, money laundering and other illegal activities quickly and accurately.
Quantexa’s machine learning system approaches that challenge as a classic big data problem — too much data for a human to parse on their own, but small work for AI algorithms processing huge amounts of that data for specific ends.
Its so-called “Contextual Decision Intelligence” models (the name Quantexa is meant to evoke “quantum” and “context”) were built initially specifically to address this for financial services, with AI tools for assessing risk and compliance and identifying financial criminal activity, leveraging relationships that Quantexa has with partners like Accenture, Deloitte, Microsoft and Google to help fill in more data gaps.
The company says its software — and this, not the data, is what is sold to companies to use over their own data sets — has handled up to 60 billion records in a single engagement. It then presents insights in the form of easily digestible graphs and other formats so that users can better understand the relationships between different entities and so on.
Today, financial services companies still make up about 60% of the company’s business, Marria said, with seven of the top 10 U.K. and Australian banks and six of the top 14 financial institutions in North America among its customers. (The list includes its strategic backer HSBC, as well as Standard Chartered Bank and Danske Bank.)
But alongside those — spurred by a huge shift in the market to rely significantly more on wider data sets, to businesses updating their systems in recent years, and the fact that, in the last year, online activity has in many cases become the “only” activity — Quantexa has expanded more significantly into other sectors.
“The Financial crisis [of 2007] was a tipping point in terms of how financial services companies became more proactive, and I’d say that the pandemic has been a turning point around other sectors like healthcare in how to become more proactive,” Marria said. “To do that you need more data and insights.”
So in the last year in particular, Quantexa has expanded to include other verticals facing financial crime, such as healthcare, insurance, government (for example in tax compliance) and telecoms/communications, but in addition to that, it has continued to diversify what it does to cover more use cases, such as building more complete customer profiles that can be used for KYC (know your customer) compliance or to serve them with more tailored products. Working with government, it’s also seeing its software getting applied to other areas of illicit activity, such as tracking and identifying human trafficking.
In all, Quantexa has “thousands” of customers in 70 markets. Quantexa cites figures from IDC that estimate the market for such services — both financial crime and more general KYC services — is worth about $114 billion annually, so there is still a lot more to play for.
“Quantexa’s proprietary technology enables clients to create single views of individuals and entities, visualized through graph network analytics and scaled with the most advanced AI technology,” said Adarsh Sarma, MD and co-head of Europe at Warburg Pincus, in a statement. “This capability has already revolutionized the way KYC, AML and fraud processes are run by some of the world’s largest financial institutions and governments, addressing a significant gap in an increasingly important part of the industry. The company’s impressive growth to date is a reflection of its invaluable value proposition in a massive total available market, as well as its continued expansion across new sectors and geographies.”
Interestingly, Marria admitted to me that the company has been approached by big tech companies and others that work with them as an acquisition target — no real surprises there — but longer term, he would like Quantexa to consider how it continues to grow on its own, with an independent future very much in his distant sights.
“Sure, an acquisition to the likes of a big tech company absolutely could happen, but I am gearing this up for an IPO,” he said.
19
2021
Forecast nabs $19M for its AI-based approach to project management and resource planning
Project management has long been a people-led aspect of the workplace, but that has slowly been changing. Trends in automation, big data and AI have not only ushered in a new wave of project management applications, but they have led to a stronger culture of people willing to use them. Today, one of the startups building a platform for the next generation of project management is announcing some funding — a sign of the traction it’s getting in the market.
Forecast, a platform and startup of the same name that uses AI to help with project management and resource planning — put simply, it uses artificial intelligence to both “read” and integrate data from different enterprise applications in order to build a bigger picture of the project and potential outcomes — has raised $19 million to continue building out its business.
The company, headquartered in London with a development office in Copenhagen, plans to use some of the funding to expand to the U.S. and the rest to continue building out its platform and business.
This funding, a Series A, comes less than a year after the startup’s commercial launch, and it was led by Balderton Capital, with previous investors Crane Ventures Partners, SEED Capital and Heartcore also participating.
Forecast closed a seed round in November 2019 and launched just as the pandemic was kicking off. It was a time when some projects were indeed put on ice, but those that went ahead did so with more caution on all sorts of fronts: financial, organizational and technical. It turned out to be a "right place, right time" moment for Forecast, which provides exactly the kind of technical platform needed to manage all of that in a better way, and the company tripled revenues during the year. Its customers include the likes of the NHS, the Red Cross, Etain and more. It says over 150,000 projects have been created and run through its platform to date.
Project management — the process of planning what you need to do, assigning resources to the task and tracking how well all of that actually goes to plan — has long been stuck between a rock and a hard place in the world of work.
It can be essential to getting things done, especially when there are multiple departments or stakeholders involved; yet it’s forever an inexact science that often does not reflect all the complexities of an actual project, and therefore may not be as useful as it could or should be.
This was a predicament that founder and CEO Dennis Kayser knew all too well, having been an engineer and technical lead on a number of big projects himself. His pedigree is an interesting one: One of his early jobs was as a developer at Varien, where he built the first version of Magento. (The company was eventually rebranded as Magento and then acquired by eBay, then spun out, then acquired again, this time by Adobe for nearly $1.7 billion, and is now a huge player in the world of e-commerce tools.) He also spent years as a consultant at IBM, where among other things he helped build and formulate the first versions of ikea.com.
In those and other projects, he saw the pitfalls of project management not done right — not just in terms of having the right people on a project at the right time, but the resource planning needed, better calculations of financial outcomes in the event of a decision going one way or the other, and so on.
He didn’t say this outright, but I’m sure one of the points of contention was the fact that the first ikea.com site didn’t actually have any e-commerce in it, just a virtual window display of sorts. That was because Ikea wanted to keep people shopping in its stores, away from the efficiency of just buying the one thing you actually need and not the 10 you do not. Yes, there are plenty of ways now of recirculating people to buy more when they add one item to a shopping cart (something the likes of Amazon has totally mastered), but this was years ago, when there were even more opportunities for innovation than there are now. All of this is to say that, had there been better project management and resource planning tools to forecast the potential outcomes of one route or another, people advocating for a different approach could have made their case better. And maybe Ikea would have jumped on board with digital commerce far sooner than it did.
“Typically you get a lot of spreadsheets, people scattered across different tools that include accounting, CRM, Gitlab and more,” Kayser said.
That became the impetus for trying to build something that takes all of that into account: a project management tool that, rather than just being a way of accounting to a higher-up or reflecting only what someone can be bothered to update in the system, can actually help a team.
“Connecting everything into our engine, we leverage data to understand what they are working on and what is the right thing to be working on, what the finances are looking like,” he continued. “So if you work in product, you can plan out who is where, and what resourcing you need, what kind of people and skills you require.” This is a more dynamic progression of some of the other newer tools that are being used for project management today, targeting, in his words, “people who graduate from Monday and Asana who need something more robust, either because they have too many people working on a project or because it’s too complicated, there is just too much stuff to handle.”
Among the more legacy tools in use, he said, are Oracle “to some degree” and Mavenlink, which he describes as possibly Forecast’s closest competitor, “but its platform is aging.”
Currently the Forecast platform has some 26 integrations with popular tools used for projects to produce its insights and intelligence, including Salesforce, Gitlab, Google Calendar, and, as it happens, Asana. But given how fragmented the market is, and the signals one might gain from any number of other resources and apps, I suspect that this list will grow as and when its customers need more integrations supported, or as Forecast works out what can be gleaned from different places to paint an even more accurate picture.
The result may not ever replace an actual human project manager, but it certainly starts to look like a “digital twin” (a phrase I have been hearing more and more these days) that will help that person, and the rest of the team, work in a smarter way.
“We are really excited to be an early investor in Forecast,” said James Wise, a partner at Balderton Capital, in a statement. “We share their belief that the next generation of SaaS products will be more than just collaboration tools, but use machine learning to actively solve problems for their users. The feedback we got from Forecast’s customers was quite incredible, both in their praise for the platform and in how much of a difference it had already made to their operations. We look forward to supporting the company to scale this impact going forward.”