
10 Ways To Prevent Shadow AI Disaster


As contributing writer Mary Pratt shares in her CIO article, just as with the shadow IT of yore, there’s no one-and-done solution that can prevent the use of unsanctioned AI technologies or the possible consequences of their use.

However, CIOs can adopt various strategies to help eliminate the use of unsanctioned AI, prevent disasters, and limit the blast radius if something does go awry. Here, IT leaders share 10 ways CIOs can do so.

Unsanctioned AI in the workplace is putting company data, systems, and business relationships at risk.

Here are 10 ways to pivot employees’ AI curiosity toward acceptable use — and organizational value.


    A big first step is working with other executives to create an acceptable use policy that outlines when, where, and how AI can be used and reiterating the organization’s overall prohibitions against using tech that has not been approved by IT, says David Kuo, executive director of privacy compliance at Wells Fargo and a member of the Emerging Trends Working Group at the nonprofit governance association ISACA. Sounds obvious, but most organizations don’t yet have one.

    Kuo acknowledges the limits of Step 1: “You can set an acceptable use policy but people are going to break the rules.” So warn them about what can happen.

    “There has to be more awareness across the organization about the risks of AI, and CIOs need to be more proactive about explaining the risks and spreading awareness about them across the organization,” says Sreekanth Menon, global leader for AI/ML services at Genpact, a global professional services and solutions firm. Outline the risks associated with AI in general as well as the heightened risks that come with the unsanctioned use of the technology.

    Kuo adds: “It can’t be one-time training, and it can’t just say ‘Don’t do this.’ You have to educate your workforce. Tell them the problems that you might have with [shadow AI] and the consequences of their bad behavior.”


    Although AI adoption is rapidly rising, research shows that confidence in harnessing the power of intelligent technologies has gone down among executives, says Fawad Bajwa, global AI practice leader at Russell Reynolds Associates, a leadership advisory firm. Bajwa believes the decline is due in part to a mismatch between expectations for AI and what it actually can deliver.

    He advises CIOs to educate on where, when, how, and to what extent AI can deliver value.

    “Having that alignment across the organization on what you want to achieve will allow you to calibrate the confidence,” he says. That in turn could keep workers from chasing AI solutions on their own in the hopes of finding a panacea to all their problems.


    One of the biggest risks around AI is data leakage, says Krishna Prasad, chief strategy officer and CIO at UST, a digital transformation solutions company.

    Sure, that risk exists with planned AI deployments, but in those cases CIOs can work with business, data and security colleagues to mitigate risks. But they don’t have the same risk review and mitigation opportunities when workers deploy AI without their involvement, thereby upping the chances that sensitive data could be exposed.

    To help head off such scenarios, Prasad advises tech, data, and security teams to review their data access policies and controls as well as their overall data loss prevention program and data monitoring capabilities to ensure they’re robust enough to prevent leakage with unsanctioned AI deployments.


    Another step that can help, Kuo says: blacklist AI tools, such as OpenAI’s ChatGPT, and set firewall rules that prevent those tools from being accessed from company systems.


    CIOs shouldn’t be the only ones working to prevent shadow AI, Kuo says. They should be enlisting their C-suite colleagues — who all have a stake in protecting the organization against any negative consequences — and getting them on board with educating their staffers on the risks of using AI tools that go against official IT procurement and AI use policies.

    “Better protection takes a village,” Kuo adds.


    Employees typically bring in technologies that they think can help them do their jobs better, not because they’re trying to hurt their employers. So CIOs can reduce the demand for unsanctioned AI by delivering the AI capabilities that best help workers achieve the priorities set for their roles.

    Bajwa says CIOs should see this as an opportunity to lead their organizations into future successes by devising AI roadmaps that not only align to business priorities but actually shape strategies. “This is a business redefining moment,” Bajwa says.


    Executive advisers say CIOs (and their C-suite colleagues) can’t drag their feet on AI adoption, because doing so hurts the organization’s competitiveness and ups the chances of shadow AI. Yet that’s happening to some degree in many places, according to Genpact and HFS Research. Their May 2024 report revealed that 45% of organizations have adopted a “wait and watch” stance on genAI and 23% are “deniers” who are skeptical of genAI.


    ISACA’s March survey found that 80% of respondents believe many jobs will be modified because of AI. If that’s the case, give workers the tools to use AI to make the modifications that will improve their jobs, says Beatriz Sanz Sáiz, global data and AI leader at EY Consulting.

    She advises CIOs to give workers throughout their organizations (not just in IT) the tools and training to create or co-create with IT some of their own intelligent assistants. She also advises CIOs to build a flexible technology stack so they can quickly support and enable such efforts as well as pivot to new large language models (LLMs) and other intelligent components as worker demands arise — thereby making employees more likely to turn to IT (rather than external sources) to build solutions.


    AI isn’t new, but the quickly escalating rate of adoption is showing more of its problems and potentials. CIOs who want to help their organizations harness the potentials (without all the problems) should be open-minded about new ways of using AI so employees don’t feel they need to go it alone.

    Bajwa offers an example around AI hallucinations: Yes, hallucinations have gotten a nearly universal bad rap, but they could be useful in creative spaces such as marketing.

    “Hallucinations can come up with ideas that none of us have thought about before,” he says.

Thanks to Mary K. Pratt, contributing writer at CIO, for this article and information. The full article can be read here.


M.R. Asks 3 Questions: Brett Shively, CEO of ACI Learning


Since 2019, CEO Brett Shively has led ACI, which provides audit, cyber, and IT training to more than 250,000 subscribers worldwide with an 80% content completion rate.

With robust leadership roles at OnCourse Learning, Everspring, DeVry, and Kaplan, Brett brings a wealth of experience to his role as CEO of ACI Learning, and his understanding of the training/L&D landscape and its future direction is deeply respected in the field.

We hope you enjoy this quick and thoughtful conversation.

M.R. Rangaswami: With so many competing business priorities, why should organizations invest in training and development?

Brett Shively: Today’s business landscape is rife with change. Budgets are tight, competition is fierce, the global talent shortage persists. Furthermore, new technologies like generative AI are shaking things up, creating both exciting opportunities and new skills gaps. 

By implementing an effective training strategy, leaders can meet this moment head on and empower their workforce to adapt. Many companies recognize the vital role employee learning plays in the success of their business, which is why the learning management system (LMS) market is expected to reach $51.9 billion USD by 2028.

The landscape is really changing so quickly, both on the technology side with AI, cybersecurity and other factors, and on the regulatory side with auditing. Companies that actively support continual learning rather than just leaving it to the employee get better outcomes.

M.R. What methods work best to upskill employees in our modern workforce?

Brett: There are a few simple steps leaders can take to start.

First, use assessment tools to uncover employees’ needs and motivations. It’s impossible to help someone if you don’t know what they need. And too often, companies apply a one-size-fits-all approach to employee learning, which can lead to redundancies, disengaged employees, and workers that are either overqualified or underqualified for their role. Assessment tools help businesses identify opportunities for upskilling or reskilling, measure learning progress, track the application of new skills in the workplace, and guide individuals along their career paths.

Then, using what was learned from the assessments, companies should tailor trainings to the specific groups of employees they’re educating — even down to the individual level if necessary. For example, learning might be tailored to generational differences: Gen Z employees might get the most value from self-led programs that feature “bingeable” snippets of content — think short-form, TikTok-esque videos. Conversely, millennials may prefer more structured, long-form, instructor-led content. In other instances, training might need to be tailored for accessibility, neurodiversity, or some other factor.

Finally, don’t underestimate the power of incentivization. Many people are already overloaded in their daily roles, and asking them to learn something new — while exciting — can also seem overwhelming. Incentivization is a simple yet effective way to motivate employees and reignite their spark for learning.


M.R. How does AI come into play with the future of learning and development?

Brett: AI is a great human intelligence booster. It’s like having a calculator to help you with complex math problems – 50 years ago using a calculator was considered “cheating,” but now they’re universally recognized as learning tools that enable students to focus on the key problem (unless they’re trying to learn how to do long division!). Generative AI can be viewed through the same lens – it helps people get a kickstart on problems, it helps them summarize and organize a large body of information quickly and it can generate creative solutions that can be refined.

In addition, AI-infused tools can help organizations take their training efforts to the next level through personalized learning, deeper engagement, boosted efficiency, and inclusive access. AI can also tailor learning experiences to employees’ individual needs by automatically adjusting difficulty levels, recommending helpful resources, and providing customized feedback.

The potential benefits of AI are vast, but organizations must be mindful regarding adoption—with so many different solutions available, companies risk overwhelming employees with new information and capabilities, inadvertently diminishing the value they’re aiming to provide. The key is to start slow and offer plenty of support as employees familiarize themselves with these new tools.

The pitfalls are well known: when using generative AI to answer critical questions, checking the answers is equally critical. Additionally, simply asking generative AI to create something and using that output without any refinement can result in generic, unhelpful answers that are easily spotted as the work of an LLM.

M.R. Rangaswami is the Co-Founder of


SEG’s 2024 SaaS M&A Public Market Report: Q1 


Software Equity Group’s Q1 M&A Report is in, revealing that the start of the year brought strong software deal volume, totaling 823 deals in 1Q24.

Despite a recent decline in SaaS M&A activity over the past two quarters, the 486 SaaS deals in 1Q24 far surpass pre-2022 levels, marking a 50% increase from the Q1 average of 2019-2021.

Here are 5 highlights from SEG’s 1Q24 SaaS Report:

  1. AGGREGATE SOFTWARE INDUSTRY M&A DEAL VOLUME HAS SETTLED into a steady state that remains strong relative to historical standards, recording 823 total deals in 1Q24, a similar level to 1Q23. The trend shows a steady increase compared to pre-COVID levels, with ‘21 and ‘22 representing a period of unprecedented M&A volume.

  2. DEAL ACTIVITY FOR SAAS M&A HAS SEEN LESS VOLUME IN THE LAST TWO QUARTERS. This trend is likely caused by a myriad of compounding variables, including regional bank instability in early 2023, macroeconomic concerns, and a high-interest rate environment that have all had lagging impacts on volume.

    With the Fed planning to cut interest rates three times this year, deal volume is expected to increase. Despite a few lighter volume quarters, the 486 SaaS deals in 1Q24 remain well above pre-2022 levels, up 50% from the 2019-2021 Q1 average.

  3. THE MEDIAN EV/TTM REVENUE MULTIPLE FOR 1Q24 WAS 3.8X, in line with 4Q23. Over the last three quarters, both the median and average EV/TTM revenue multiples for SaaS M&A have stabilized, indicating that the M&A market has fully responded to a higher interest rate environment and caught up to the public SaaS market.

    M&A valuations for the remainder of 2024 will be impacted by the Fed’s interest rate-cutting trajectory, but in the interim, high-performing businesses continue to receive strong outcomes in the M&A markets.

  4. VERTICAL SAAS COMPRISED 49% OF ALL SAAS M&A DEALS in 1Q24, continuing the trend of buyers and investors seeking the types of purpose-built, mission critical applications that are the calling card of vertical software companies.

    Healthcare and Financial Services represented the most active verticals this quarter. However, several other verticals, including Hospitality, Retail, and Manufacturing, saw increased activity YOY, indicating that buyers and investors are not just looking for deals in historically active verticals but rather widening their focus to comprise a variety of verticals.

  5. PRIVATE EQUITY CONTINUED TO PACE SAAS M&A ACTIVITY (59% of 1Q24 deals), driven by PE-backed strategics (48%) that continue to leverage an ideal mix of product synergies and capital allocated to M&A. Strategic buyers (40.7% of Q1 deals) had their most active two-quarter stretch since 1Q22.

To read the full Software Equity Group’s SaaS M&A and Public Market Report, click here.


M.R. Asks 3 Questions: Brian Biro, Bestselling Author & Speaker


Brian Biro is armed with degrees from Stanford University and UCLA, is the author of many bestselling titles, and has delivered over 1,800 presentations globally on leadership, team building, and breaking through.

His latest book, “Lessons from the Legends,” draws inspiration from NCAA and SEC championship-winning coaches Pat Summitt and John Wooden, offering a championship team-building formula applicable to business leaders, parents, and educators.

M.R. Rangaswami: How can business leaders adapt these principles to create high-performing teams in the corporate world, especially in the face of rapid changes and challenges?

Brian Biro: In today’s business world, with its obsession with accelerating technology and AI, it can be easy for leaders to forget that they are actually in the PEOPLE business. It is how you and your team grow that will determine how far you can go. Both Coach Summitt and Coach Wooden realized they coached people more than basketball.

One of the leadership practices they each used was to guide their teams to focus on controlling their controllables. They believed they did not control results, but they could powerfully impact the effort, energy, attitude, and constant drive to improve that would lead to breakthrough results. Both of these great coaches demonstrated immense humility which also had a profound impact on those they led.

They set the example of GIVING credit and taking responsibility. And, it’s amazing what’s accomplished when no one cares who gets the credit. They became shining examples of personal responsibility, always seeking to learn and improve rather than blame.

Perhaps most importantly, both coaches focused every single day on the models of personal excellence they developed: Coach Summitt on her Definite Dozen, and Coach Wooden on his Pyramid of Success. That dogged consistency on the principles and qualities they most sought to develop ignited unstoppable cultures at UCLA and the University of Tennessee. Nothing is more important to long-term success in the business arena than a powerful, unstoppable culture.

M.R.: How can business leaders develop resilience in themselves and their teams, particularly in the dynamic and unpredictable business environment we often face?

Brian: Business leaders can learn from these two legends to develop extraordinary resilience through a few very powerful leadership practices the coaches lived by. First, they were blame-busters! If you think about blame in the context of time, blame is always about the past. You cannot change the past. So, whenever you find yourself in blame, you are in the past. Coach Summitt and Coach Wooden did not pretend that mistakes weren’t made or that their players and coaches did not make mistakes. But, when mistakes were made, rather than getting stuck in blame, they moved forward by asking what they could learn from setbacks and mistakes to get better. As a result, their players and coaches weren’t terrified of making mistakes and focused instead on constant learning and improvement.

Second, both coaches lived by the practice of ending every game, workout, and day on a positive note because that would create a springboard into tomorrow. This simple leadership practice created a positive energy about the future: that something good was coming tomorrow. Finally, both coaches’ constant focus on controlling controllables and letting go of comparison was especially important when dealing with challenges and setbacks. Whenever we focus on what we DO control rather than on what we don’t, we generate momentum, confidence, and resilience.

M.R.: How can business leaders balance passion and composure in their leadership style, and when might one approach be more effective than the other?

Brian: Pat Summitt was known for her passion, while John Wooden was characterized by his calm and even-keeled approach. I wrote this book in part to demonstrate that styles are far less important in the long-run than core values, humility, and character. Each coach was 100% authentic in their style and intensity. Both were passionate about teaching and giving credit. Both believed there are no over-achievers, that we have more in us than we know, and both were passionate about helping everyone they led to rise as close to their potential as possible. The great value of John Wooden’s style was his focus on listening. For Coach Summitt it was her intensity about demanding one’s best. Though they went about it differently, both were incredibly PRESENT for others. Only by being fully present do we communicate to others that they are important, significant, and that they matter. 

M.R. Rangaswami is the Co-Founder of


The Art of Bootstrapping: Building Success From The Ground Up


Bootstrapping (running a business without external capital) has been a common practice since the inception of entrepreneurship. While bootstrapping is an understated way of growing a business, it has stood the test of time in all market cycles if done correctly.

This Allied Advisor report profiles operational metrics for bootstrapped companies and has examples of businesses which scaled successfully using this route before taking in significant capital infusion or exiting.

Here are three key points of the report, along with highlights from its Netcore feature: a company that bootstrapped its way to becoming a $110M ARR SaaS business.

1. Growth Fueled by Strong Operational and Capital Efficiency

    Rule of 40 (growth rate plus profit margin of at least 40%) is considered an investor’s benchmark for an investable company. Empirical data indicates that bootstrapped companies broadly score higher on the Rule of 40 compared to VC-backed businesses at most revenue levels; VC-funded companies are typically growth focused, often at the expense of profitability.
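As a quick illustration of the arithmetic (with made-up figures, not data from the report), the Rule of 40 check is simply:

```python
def rule_of_40(revenue_growth_pct: float, profit_margin_pct: float) -> bool:
    """A company 'passes' the Rule of 40 when its revenue growth rate
    plus its profit margin totals at least 40 percentage points."""
    return revenue_growth_pct + profit_margin_pct >= 40.0

# A fast-growing but unprofitable company can still pass:
print(rule_of_40(55.0, -10.0))  # True  (55 - 10 = 45)

# A slow-growing, modestly profitable one may not:
print(rule_of_40(10.0, 20.0))   # False (10 + 20 = 30)
```

Either growth or profitability can carry the score, which is why the metric is read as a balance between the two rather than a target for each.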

    2. Cost Structure of Bootstrapped vs. VC-Funded SaaS Companies

    Bootstrapped companies prioritize operational efficiency through their lean cost structure. With all spending on marketing and R&D sourced from generated income, they focus keenly on optimizing operations. For example, as the chart below shows, the marketing cost for bootstrapped companies is 17%, compared to 27% for VC-funded companies.

    3. Compelling Exit Valuations for Bootstrapped Companies

    The selection of an exit strategy for a startup is influenced by its growth stage, market conditions, and strategic objectives. The two exit options presented at the end are an IPO or an M&A, the former being more frequent. Companies opting for merger exits command a higher EV/Revenue multiple of 17.2x, compared to a multiple of 8.8x for those choosing to go public.


    Bootstrapping a $110M ARR SaaS Company: The Netcore Story

    Rajesh Jain, founder of Netcore, not only bootstrapped his business profitably to over $110M in ARR, but also purchased Unbxd for $100M in cash to further accelerate growth. Having grown two successful, profitable bootstrapped businesses, Jain shared a few highlights in his interview with Allied Advisors:

    AA: You coined the term Proficorn – can you advise what this is for the benefit of founders and

    RJ: A Proficorn is the antithesis of a Unicorn, wherein the founders own the complete stake in the company instead of investors.

    The trick is to combine profitability of bootstrapped businesses with scaling

    AA: You have done it twice in your life – what is the magic to doing this?

    RJ: The key is to prioritize a path to profitability from day one, steering clear of a growth-at-all-costs mindset.

    Forging a path to profitability swiftly drives frugal operations, ensuring judicious use of limited capital.

    Successful bootstrap businesses should seek a balance between profitability and growth, exploring various avenues for profit instead of seeing it as an exclusive choice against growth.

    AA: How did you bootstrap Netcore to $110M ARR without outside capital?

    RJ: High gross margins made our company profitable without external funding.

    Establishing the initial profit engine is vital to sustain and capitalize on market opportunities, avoiding stagnation.

    AA: How did you acquire a $100M business – Unbxd, without using outside capital?

    RJ: Our company consistently saved and reinvested profits through incremental growth via product development and strategic acquisitions.

    Acquisition was focused on a company with business in key markets (US, UK, Australia) with complementary offerings. Unbxd, with its B2B revenue and India-based team, was an ideal fit, funded by internal accruals.

    The vision lies in merging customer data with the product catalog, a unique strength in B2C-Martech, shaping the company’s future.

    To read more of the data presented in this Allied Advisors report, click here.

    Thanks to Gaurav Bhasin, Managing Director at Allied Advisors for pulling together this report.


    M.R. Asks 3 Questions: Founder and CEO of Kasada, Sam Crowther


    Sam Crowther created Kasada when he was only 19 years old, in a small shipping container under the Sydney Harbour Bridge.

    Nine years later, Sam has tripled his team, raised over 39 million USD, and now protects more than 150 billion in annualised eCommerce and more than 100 million internet users daily. Last year he made the Forbes 30 Under 30 list, and Kasada’s aggressive approach to predicting and preventing bot attacks and online fraud is creating a safer, more secure digital experience for everyone.

    M.R. Rangaswami: When it comes to bots, what are the most pressing challenges for enterprises today?

    Sam Crowther: Attackers are driven by money, and the use of bots has proven to be a quick, effective way to acquire and resell goods (like tickets, electronics, and shoes) and commit online fraud for huge profits. Access to bots has become democratized: anybody can purchase a sophisticated bot (increasingly offered as a service) at little to no cost and use it without needing a technical understanding.

    Another part of the challenge is that enterprises have historically relied on inadequate, costly bot defenses. Traditional tools are static, allowing time for botters to reverse engineer and get past them. Or they require human interaction (like annoying CAPTCHAs), which frustrates the user experience. Attackers are incredibly motivated to work around these defenses, constantly changing their attack methods to stay a step ahead of defenders. This is all incredibly costly for businesses, both in the costs incurred by playing whack-a-mole with ineffective defenses and in the bots themselves, as processing fake traffic is expensive.

    There’s a huge disparity. Users of bots are able to evade defenses at little to no cost, yet many businesses spend millions of dollars in an attempt to protect against bots and still can’t move at the increasing speed of the attacker. The bots are winning, and Kasada set out to change this paradigm.

    M.R.: How is AI changing the bot landscape?

    Sam: Bots are being used to exploit AI to damage brands, breach systems, and cost businesses a lot of money.

    One of the most immediate areas is using AI to bypass CAPTCHAs. AI image recognition has gotten good enough that it can bypass even the newest forms of CAPTCHAs with very high accuracy and at a speed far quicker than a human can. That’s no good, because the only ones fooled by CAPTCHAs nowadays are humans, not the bots, resulting in a horrible user experience for those who decide to use them while doing very little, if anything, to secure the experience.

    One of the biggest existential threats to online businesses today is that AI companies have embraced web scrapers (also known as web crawlers) to haul in huge volumes of data from other companies to train their large language models (LLMs). This has ramifications for businesses that rely on website traffic for monetization, in addition to content creators who don’t receive acknowledgement or payment for their work. These persistent web scrapers can be extremely difficult to stop and detect.

    Bots are also being used to reverse engineer businesses’ customized LLMs and expose private data or intellectual property via prompt injection attacks. Incorporating generative AI into web applications and mobile apps is opening up a whole new attack surface that adversaries aim to exploit to extract personal information.

    M.R.: How is Kasada addressing customers’ bot challenges?

    Sam: One of the keys to success is to take away an attacker’s ability to be successful, impacting their ability to generate a profit. That means making attacks as costly and frustrating as possible in order to disincentivize the adversary.

    That’s exactly what we’ve done. We have created a system with a proprietary language that dynamically changes itself to present differently every time someone tries to figure it out. This makes it very time consuming and frustrating to even begin understanding the Kasada defense techniques being applied. In addition, we study our adversaries to understand the tools and techniques they use to evade detection. We anticipate these and build layers of resilience in our system so they are forced to raise their game and constantly evolve their methods.

    Bot detection is a game of cat and mouse. We stay ahead by making sure our dynamic platform and team of experts pivot quicker than the adversary. We make it effortless for our customers to use without any management, and never impede the user experience with visual CAPTCHAs. This is where early market entrants have fallen down — their defenses are static and don’t move fast enough — and they place all of the management overhead on the business which is not a path to success. We’ve learned from our predecessors to create something that not only works better to stop modern bots, but is incredibly simple to use so our customers can focus on growing their business, instead of defending it.

    M.R. Rangaswami is the Co-Founder of


    20 Factors to Track When Valuing Your Software Business (by SEG)


    For more than two decades, Software Equity Group has honed a proprietary methodology to evaluate software companies and assess their readiness to exit.

    By considering critical aspects of the software industry, such as market demand, competitive positioning, financial performance, and more, we have not only achieved a leading first-pass success rate but consistently secured and often surpassed the valuation multiples our clients aspire to, guiding software operators to successful exits and garnering industry accolades along the way.

    Sandhill Group has summarized the 20 factors to track when valuing a software business; however, the full report from SEG examines the quantitative and qualitative measurements that were used.

    Those details can be reviewed here.


    1) GROSS REVENUE RETENTION (GRR)

    A GRR acts as a reliable compass pointing toward revenue stability. A high GRR often signals a robust level of customer satisfaction and loyalty. Customers continue to find value in the product or service, leading to sustained revenues. Conversely, a low GRR could flag potential satisfaction, product fit, or service delivery issues.
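A minimal sketch of the GRR arithmetic, using hypothetical figures (churn and downgrades reduce the retained base; expansion revenue is deliberately excluded):

```python
def gross_revenue_retention(starting_arr: float, churned_arr: float,
                            downgrade_arr: float) -> float:
    """GRR: share of the starting recurring revenue base kept over the
    period, counting churn and downgrades but ignoring expansion,
    so it can never exceed 100%."""
    return (starting_arr - churned_arr - downgrade_arr) / starting_arr

# Hypothetical year: $5M starting ARR, $300k churned, $100k downgraded.
print(f"{gross_revenue_retention(5_000_000, 300_000, 100_000):.0%}")  # 92%
```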


    2) ARR GROWTH RATE

    The ARR growth rate highlights your business’s growth trajectory and future potential. It not only assures buyers and investors of the prevailing market demand, strong product-market fit, and unique product differentiation but also attests to your capability to capitalize on that demand. While a significant growth spike in a single year is promising, consistent growth over several years, as indicated by the Compound Annual Growth Rate (CAGR), is more telling.
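Since the passage leans on CAGR, here is the standard formula with hypothetical numbers (not figures from SEG's methodology):

```python
def cagr(start_value: float, end_value: float, years: float) -> float:
    """Compound Annual Growth Rate: the constant yearly rate that takes
    start_value to end_value over the given number of years."""
    return (end_value / start_value) ** (1.0 / years) - 1.0

# Hypothetical: ARR grows from $10M to $20M over 3 years.
print(f"{cagr(10.0, 20.0, 3):.1%}")  # ~26.0% per year
```

A single 100% spike followed by flat years would produce the same endpoint but a much less convincing growth story than three steady ~26% years.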


    3) EBITDA MARGIN

    EBITDA margin is a key financial metric that provides insights into the efficiency of the company’s operations and the amount of cash flow generated by core activities. By stripping away factors that can vary greatly between companies, such as interest payments, tax strategies, and amortization, it reveals the underlying profitability of business operations and allows for apples-to-apples comparisons with other SaaS companies.
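The add-back arithmetic the paragraph describes can be sketched as follows (all figures hypothetical):

```python
def ebitda_margin(net_income: float, interest: float, taxes: float,
                  depreciation: float, amortization: float,
                  revenue: float) -> float:
    """Add interest, taxes, depreciation, and amortization back to net
    income to get EBITDA, then express it as a share of revenue."""
    ebitda = net_income + interest + taxes + depreciation + amortization
    return ebitda / revenue

# Hypothetical P&L, in $M: $2.0 net income, $0.5 interest, $1.0 taxes,
# $0.8 depreciation, $0.7 amortization, on $20M revenue.
print(f"{ebitda_margin(2.0, 0.5, 1.0, 0.8, 0.7, 20.0):.0%}")  # 25%
```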

    4) RULE OF 40

    For SaaS businesses, balancing growth and profitability can be challenging. Rapid growth often leads to higher costs, while a sole focus on profitability might stifle expansion. The Rule of 40 provides a holistic view, helping companies determine if their growth strategies are sustainable.


    5) GROSS MARGIN

    Gross margins highlight the profitability efficiency of software companies, which often have high multiples due to their robust profit margins. A solid gross margin indicates more profits available for business reinvestment, making it a crucial measure of a company’s financial health and long-term profitability.

    6) LTV:CAC

    LTV:CAC is an important unit economic metric, encompassing several levers within the business, which speaks to the efficiency of the business model. By dissecting this formula, one can derive insights into essential components like ARR, Gross Retention, sales and marketing efficiency, and ROI. This metric indicates how efficiently a company’s customer acquisition strategy affects its profitability. Ideally, the revenue a customer brings should exceed the cost to acquire them. A high ratio means the business is seeing a good return on its sales and marketing investments.
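One common way to compute the ratio — a simplified model, not SEG's exact methodology — approximates customer lifetime as one divided by the annual churn rate:

```python
def ltv(annual_revenue_per_account: float, gross_margin: float,
        annual_churn_rate: float) -> float:
    """Simplified LTV: annual gross profit per account divided by the
    annual churn rate (i.e., spread over the average customer lifetime)."""
    return (annual_revenue_per_account * gross_margin) / annual_churn_rate

# Hypothetical inputs: $12k ARR per account, 80% gross margin,
# 10% annual churn, $20k cost to acquire a customer (CAC).
customer_ltv = ltv(12_000, 0.80, 0.10)  # $96,000 lifetime gross profit
print(customer_ltv / 20_000)            # 4.8, above the oft-cited 3:1 target
```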


    7) CUSTOMER CONCENTRATION

    Too much revenue from one customer can be perceived as a risk to potential buyers or investors. If that customer leaves, it could disproportionately impact the company’s revenues and profitability.

    8) TOTAL ARR

    ARR indicates the yearly recurring revenue a company expects, making it a key signal of its overall health and growth potential. Buyers and investors pay close attention to this metric, especially for software companies in the lower-middle market segment.


    NRR offers a glimpse into customer satisfaction, how well the product fits the market, and the company’s skill in boosting ARR from current customers. A high NRR suggests growing revenue from the existing customer base, indicating strong loyalty, effective upsell strategies, and a product that consistently meets user needs.
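    A hedged sketch of the standard NRR calculation, using a hypothetical customer cohort (new-customer revenue is excluded; only the starting cohort counts):

```python
def net_revenue_retention(starting_arr: float, expansion: float,
                          contraction: float, churned: float) -> float:
    """NRR over a period, measured on the starting customer cohort:
    (starting ARR + expansion - contraction - churn) / starting ARR."""
    return (starting_arr + expansion - contraction - churned) / starting_arr

# Hypothetical cohort: $1M starting ARR, $200k upsell,
# $30k downgrades, $50k churned.
print(f"{net_revenue_retention(1_000_000, 200_000, 30_000, 50_000):.0%}")  # 112%
```

    An NRR above 100%, as in this sketch, means the existing base grew on its own even before any new-customer revenue is counted.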


    Revenue Growth offers an encompassing perspective on a company’s fiscal health, extending beyond the insights provided by ARR growth, which we’ve recognized as a significant metric with our “High” weightage. This insight lets us discern exactly what is propelling the company’s growth. For buyers and investors, growth primarily driven by ARR is often more attractive for the reasons we stated previously.


    Logo retention provides insight into a company’s customer satisfaction levels, the effectiveness of its customer retention strategies, and the overall stickiness of its products and services. A high rate suggests that a company has a durable customer base and demonstrates that customers find value in the company’s offerings. It also implies that your company’s customer relationship management efforts are effective and capable of building long-lasting ties.

    As such, this reflects positively on your brand reputation and market position, which can, in turn, help attract new customers, enhance market share, and positively impact the company’s enterprise value.
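    As an illustrative sketch with hypothetical counts, gross logo retention is simply the share of starting customers still on board at period end:

```python
def logo_retention(logos_at_start: int, logos_lost: int) -> float:
    """Gross logo retention: fraction of customers (logos) present at the
    start of the period that remain at the end; new logos are excluded."""
    return (logos_at_start - logos_lost) / logos_at_start

# Hypothetical: 250 customers at the start of the year, 20 lost during it.
print(f"{logo_retention(250, 20):.0%}")  # 92%
```

    Unlike NRR, this metric counts customers rather than dollars, so it surfaces stickiness even when lost accounts were small.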


    The delivery model is the core of the business, shaping how customers interact with products and services, how updates are deployed, and how costs are structured. It dictates user experience, rollout strategies for updates, cost dynamics, and, crucially, the scalability of the software’s architecture.


    The pricing model is vital because it directly impacts revenue visibility, which is the ability to predict and anticipate future revenue streams. A clear understanding of future revenue is essential for planning, scaling operations, and making informed business decisions. Stable and predictable revenue is particularly attractive to stakeholders as it minimizes uncertainty and risk.


    Product differentiation is the key to highly valued software businesses. Differentiated products usually lead to strong customer retention, significant growth, and increased customer loyalty. In contrast, commoditized products struggle to compete and face higher churn rates. Product differentiation can be achieved through usability, product depth and breadth, vertically focused or purpose-built solutions, and so on. Differentiated products aren’t easily duplicated overnight, which is important to buyers and investors.


    Market attractiveness is crucial as it delineates the opportunities and limitations of a company and its offerings. In expansive, growing, and less congested markets, a company’s value is particularly enhanced when it occupies a unique position. Market attractiveness is shaped by elements such as growth potential, long-term profitability, product relevancy, and the ability to adapt to changing consumer behaviors.

    Recognizing these factors informs potential buyers about a company’s growth trajectory and responsiveness to market shifts.


    The technology behind a software product directly influences the product’s performance and user experience. A well-designed and efficient tech stack demonstrates a company’s ability to scale its operations seamlessly and accommodate a growing customer base without compromising performance.


    The management team is pivotal in guiding a software company’s trajectory and success. Buyers and investors prioritize leadership with a track record of executing business plans and possessing a forward-thinking vision. Beyond product and financials, the team’s past successes suggest future potential, elevating valuation. With stable leadership, succession planning, and a positive culture, the company’s value in the eyes of stakeholders rises. While these qualities are beneficial, they aren’t all mandatory for a company to be seen as valuable.


    Expanding markets hint at increased adoption, making it potentially easier for businesses to make sales. Even in situations where the market might not be expanding significantly, there can still be robust growth opportunities. If the Total Addressable Market (which we will discuss next) is sizable and a company’s solution offers substantial value, it can thrive. This is often the case with many of our clients who are displacing legacy technology or automating manual processes.

    Their strong value propositions provide solid growth prospects despite the absence of rapid market growth. Awareness of market growth rates facilitates goal setting for companies and enlightens potential buyers and investors about the prospective sales momentum.


    A substantial TAM indicates vast growth and profit potential, but it’s crucial that the company can effectively tap into revenue from this market. Conversely, a smaller TAM might suggest a niche, potentially limiting long-term growth. For SaaS companies that have chosen not to raise a substantial amount of money through venture capital, it’s essential for a company’s TAM to be expansive enough for growth yet not so vast that it becomes a magnet for intense competition, which could hamper its ability to maintain or even capture market share. The ideal TAM is a balance: large enough for substantial growth potential while maintaining your strong differentiation but not so immense that it becomes a competitive battleground.


    By examining the direction of KPIs, potential acquirers can ascertain whether the business is progressing favorably or veering off course. While the current metric might not represent an ideal picture, a positive trend signifies potential. An upward trend can be leveraged to paint a compelling narrative of a promising future, which is particularly vital for attracting buyers and investors. Even if today’s numbers aren’t optimal, a trajectory pointing in the right direction can bolster confidence, enabling stakeholders to envision and advocate for the business’s long-term potential. Similarly, from a market standpoint, if the market is undergoing a significant inflection point and driving demand for new innovative solutions, a company offering such solutions can have a positive impact on value.

    To view SEG’s full report in detail, click here:

    Read More

    M.R. Asks 3 Questions: Gaurav Dhillon, Chairman and CEO of SnapLogic

    By Article

    Gaurav Dhillon is the Chairman and CEO of SnapLogic, overseeing the company’s strategy, operations, financing, and partnerships. Having previously founded and taken Informatica through IPO, Dhillon is an experienced builder of technology companies with a compelling vision and value proposition that promises simpler, faster, and more cost-effective ways to integrate data and applications for improved decision-making and better business outcomes. 

    M.R. Rangaswami: As we’re heading into the new year, how can leaders begin to make room in budgets to take advantage of AI?  

    Gaurav Dhillon: As generative AI continues to be the topic of conversation in every boardroom, the question board members are asking leaders is not whether they can afford to invest in generative AI but what they will lose if they don’t. As AI-driven technologies continue to expand in reach, there is a new baseline for business operations, which includes evolving customer expectations. Any sophisticated task with the potential to be automated will be automated.

    Companies must adapt to maintain a competitive edge, and until a company strategically harnesses AI, it will struggle to meet the industry’s new productivity standards. As organizations begin to prepare for AI implementation, it’s important for them to prioritize reducing their legacy debt—or what is commonly known as technical debt. 

    The challenge with legacy tech stacks is that they are built around older and outdated languages and libraries, which inhibit an organization’s ability to successfully integrate new applications and systems, including GenAI tools. Modernizing infrastructure is key to ensuring enterprise data is ready for widespread AI adoption and use across the business. AI adoption is increasingly becoming integral to a company’s relevance, efficiency and effectiveness. 

    M.R.: What do you believe are the biggest inhibitors to AI adoption in the workplace? 

    Gaurav: The biggest inhibitors of AI adoption in the enterprise are rooted in the fact that people look at consumer AI tools like ChatGPT and make comparisons to their own products. AI is fueled by data, and enterprise AI needs to have guardrails on what type of data it can access. Today, we are still at the hunter-gatherer stage with business data. 

    Another inhibitor to AI adoption for organizations is security. Ideally, businesses want to leverage AI tools to ask questions about customers, but in order to get to this stage, organizations first need guardrails to ensure that the data is handled and accessed securely. The stakes for consumer AI are low because if you ask ChatGPT to write you a recipe for dinner and it turns out bad, you lose a meal. The bar for enterprise AI is much higher; if a customer looks to your business for answers and solutions, people’s jobs can be at stake.

    M.R.: As a two-time founder, what key lessons have you learned that you believe every leader should be aware of, especially in the midst of today’s AI revolution?

    Gaurav: The hardest lesson I’ve had to come to terms with is that product-market fit is a scientific art. Companies can do and build amazing things at scale, but that alone won’t determine or define their success. Closely engaging with and listening to early adopters and customers is the only way successful business leaders can discern and establish the ideal product-market fit. As a founder and entrepreneur, it’s critical to be a part of this exploration from the very start. While motions like scale can be delegated, product-market fit cannot.

    Read More

    M.R. Asks 3 Questions: CEO and Co-Founder of Rhythm Systems, Patrick Thean

    By Article

    Patrick Thean isn’t a boxer, but he loves to quote Mike Tyson in saying, “Everyone has a strategy until I punch them in the mouth.” Through his years as a CEO, serial entrepreneur, and coach to other company leaders, he has become an expert not only in crafting visionary strategy, but in executing with mastery.

    Patrick is a USA Today and Wall Street Journal bestselling author. With his book Rhythm: How to Achieve Breakthrough Execution and Accelerate Growth, he shares a simple system for encouraging teams to execute better and faster. He reveals early signs of common setbacks in entrepreneurship and how to make the necessary adjustments not only to stay on track, but also to accelerate growth.

    His work has been seen on NBC, CBS, and Fox. Patrick was named Ernst & Young Entrepreneur of the Year in 1996 for North Carolina as he grew his first company, Metasys, to #151 on the Inc 500 (now called the Inc. 5000). 

    Currently the CEO and Co-Founder of Rhythm Systems, Patrick Thean is focused on helping CEOs and their teams experience breakthroughs to achieve their dreams and goals. 

    M.R. Rangaswami: Crafting a compelling vision is often cited as a critical aspect of strategic leadership. How do you recommend leaders go about developing a clear and inspiring vision for their organizations, and what are the key components that should be included in a well-defined vision statement?

    Patrick Thean: If you want to create a compelling vision, you first need to change how you approach strategic thinking. Strategic thinking should not be something you do randomly or squeeze into action-focused meetings. You need to get into a Think Rhythm. Start having regular Think sessions where you and your team reflect on your past achievements and challenges and imagine an inspiring future together. 

    During your Think sessions, you really have to step back from daily operational work and focus on the future of your business. Make it clear to your team that this time is for thinking only – not for finalizing goals or jumping into action. Play around and have fun brainstorming! Don’t shoot any ideas down. 

    When it comes to crafting a vision, use your Think sessions to dream big. Let your imagination run wild as you imagine what your company could look like five, ten, or even twenty years from now. Experiment with exercises like the Destination Postcard (which asks you to envision your company one year from now, but can be adapted to longer amounts of time). Be specific and include elements like the impact you want your company to make and the growth you want to achieve.

    Once you and your team have talked through these ideas and have gotten excited about a shared vision, craft a vision statement that will inspire the rest of your employees to step boldly into the future with you. Avoid corporate buzzwords and “fluff” (marketing language). The vision should be easy to read, and it should connect with people’s hearts. You want the rest of your company to feel just as excited about the future as you are!

    M.R.: Once a vision is crafted, what strategies do you recommend for fostering alignment across different teams and departments to achieve this vision?

    Patrick: Alignment starts at the very top. The CEO and leadership team need to clearly and repeatedly communicate the company’s vision to all other employees. And as you’re doing this, you can enter your second Rhythm – the Plan Rhythm.

    During the Plan Rhythm, you need to come together with your leadership team every quarter and every year to discuss, debate, and agree on priorities that move the company in the right direction. Each person on the team should know what they are responsible for accomplishing. Break each priority down into key tasks or milestones to avoid falling into the strategy execution gap.

    Then you will cascade the company’s plan down to the departments. They will follow the same planning process to agree upon their own priorities, which align with and support the company’s goals. Teams need to talk cross-departmentally, too, to ensure alignment is horizontal as well as vertical. They need to plan for smooth project hand-offs to avoid waste, rework, and worst of all: disappointed customers.

    Alignment isn’t just important when it comes to executing a plan with your team. Cultural alignment is important, too. Everyone on your team needs to be aligned with your company’s core values and have the right mindsets. This will ensure that they are behaving in ways that create the kind of work culture you’re trying to foster. If they’re seriously misaligned, you might see behaviors that create tension among the team or spin a priority off its track.

    Even when you have a team of A-Players who are aligned on your core values and aligned on a plan, you need to keep realigning week after week by getting into a Do Rhythm. Hold Weekly Adjustment Meetings to discuss the progress of your top priorities. This practice will give your team thirteen opportunities to take action and reorient when your goals are veering off track. 

    M.R.: What advice do you offer to leaders striving to cultivate a high-performance mindset within their teams, particularly during times of change or uncertainty? 

    Patrick: If you are leaving performance conversations to once or twice a year, you are actually decreasing employee engagement. Nobody wants to wait six or twelve months to hear what they’ve been doing well and what they need to work on. A disengaged employee doesn’t perform well and is more likely to leave, which costs valuable organizational knowledge, time recruiting and training a new hire, and of course – money.

    You need to take a proactive approach to performance instead. Make sure every person on your team, from the C-suite to the frontline employee, understands their role and responsibilities. I recommend using Job Scorecards to make this clear and easy to understand for employees and managers. When people know what is expected of them and what goals they should be working towards, they’re more engaged and they do better work. They also don’t waste time working on the wrong things that won’t really benefit the company. When performance reviews roll around, they will already understand what they’re going to be rated on, because they’ve been working on it the whole time in accordance with their Job Scorecard. This takes much of the fear of the unknown out of the process.  

    Week to week or month to month, managers should be checking in with their employees by holding 1:1s. A regular 1:1 cadence encourages transparency and accountability. It’s a candid conversation that prompts ongoing feedback in both directions. Managers should also use these meetings to provide coaching and help employees grow their skills and careers. 

    This is especially important during times of uncertainty, when employees may start to question their job security. If the line of communication is open between manager and employee, you help reduce your employees’ fear of being blindsided by bad news. And when managers are focused on growing and developing their people, employees will feel cared for and engaged. They will do their jobs much better than they would if they were kept in the dark about their own performance.

    M.R. Rangaswami is the Co-Founder of

    Read More

    M.R. Asks 3 Questions: James Harold Webb, Chairman and CEO of Paradigm Development Holdings

    By Article

    James Webb says the difference between success and failure often comes down to whether the person thinks big in the early stage of the business.

    Author of Redneck Resilience: A Country Boy’s Journey To Prosperity, James is an investor, philanthropist and successful multi-business owner. He began his entrepreneurial journey in the health industry as the owner of several companies focused on outpatient medical imaging, pain management and laboratory services.

    Following successful exits from those companies, James shifted his focus to the franchise world and developed, owned and oversaw the management of 33 Orangetheory Fitness® gyms, which he sold in 2019. Not one to stop, he currently has two additional franchise companies in various stages of growth.

    His perspective as a life-long entrepreneur offers great insights for those looking to branch out by building businesses they own and connecting them with their big-picture plan.

    M.R. Rangaswami: What are the top two most common missteps a young entrepreneur makes in their first two years of business?

    James Harold Webb: There are many mistakes an entrepreneur can make during the start-up stage of their business. Taking money “off the table” too quickly can lead to an assortment of problems, including holding back infrastructure building and expansion, and creating cash shortages. Other than my “salary” (if needed), I tend to leave all the money in the business for several years. The only exception is determining any income tax consequences and taking what I call a “tax distribution,” solely for the purpose of paying the prior year’s income taxes or quarterly income tax payments.

    I see too many 8-to-5ers who are not putting in the time or effort it takes to get a business off the ground and profitable. When you are ready to stop for the day, make one more phone call or send out one more email. Solve one more problem. Unbox one more package. Whatever it takes, just work harder than anyone else.

    M.R.: How important is a leadership team in the early stages of building a business? What (if any!) budget should people allocate to that leadership team? 

    James: Leadership is one of the key elements of a successful business. Creating a corporate culture from the beginning is crucial. Establishing relationships is also extremely high on the leadership list, whether it be with fellow corporate staff, employees, vendors, banking, or even competitors. Listen to people. Invest in people. Take the time to recognize people and to hold yourself accountable to them. Relationships will define your success.

    M.R.: How can someone who is just starting their business beat the odds and not fail in the first five years? 

    James: Work harder than anyone else.

    Hope for the upside, but always plan for the downside. Stay focused on your upside and driving your business to success, but have a contingency plan for the “what ifs.”

    Build a solid infrastructure before you reap the benefits of your venture. Find the right people who are dedicated to helping you reach your dream of success.

    With employees, be clear in your expectations, hold them accountable, and be available to assist and direct as needed. Contrary to popular belief, you can be a boss and a “friend.” If they can’t get it done and you’ve done all of the above, then it’s time to let them go.

    M.R. Rangaswami is the Co-Founder of

    Read More

    M.R. Asks 3 Questions: Sunil Sanghavi, CEO of NobleAI

    By Article

    2023 was undoubtedly the year that AI barnstormed our tech consciousness. Trained on massive amounts of public data, AI generated cool new images, wrote up content summaries, and delivered seemingly original work in the blink of an eye. Could this also be the future of helping companies balance the need for sustainable, green innovation against resource and supply chain constraints?

    Artificial intelligence offers promise for accelerating materials/formulation R&D. But AI for science needs to be uniquely focused, applying small, curated use-case AI models that map to multiple scientific principles at a time, to speed scientific discovery. This has the potential to be a game-changer across a wide range of fields, including medicine, agriculture, engineering, and more, which is why Sunil believes that 2024 will be the year for specialized AI.

    Sunil Sanghavi is currently CEO of NobleAI, a pioneer in Science-based AI solutions for chemical and material informatics. He has a rich and diverse operating background in deep-tech companies over 40 years. Most recently, he was Senior Investment Director at Intel Capital, where he invested in AI/ML hardware and software companies including Motivo, Untether AI, Syntiant, and Kyndi. He attended the MSc Chemistry program at the Indian Institute of Technology Bombay and obtained a BSEE from Cal Poly, San Luis Obispo.

    M.R. Rangaswami: You have an impressive resume leading a variety of companies. What led you to NobleAI at this time? 

    Sunil Sanghavi: Generative AI dominated the discussion in 2023, and will certainly continue to be a fascinating area to watch. At this point most people have experimented with the many available LLM-based tools and understand how they can help us with everyday tasks. But what I find most exciting is the opportunity to apply AI to speed scientific discovery. Science-Based AI (SBAI) has the potential to be a game-changer across a wide range of fields, including chemistry, materials, energy, and many others.

    That area is very exciting to me and is what drew me to NobleAI, where we’re showing the power of Science-Based AI (SBAI) to help companies achieve their goals. As opposed to large language models (LLMs), which is what GenAI is (basically scraping massive amounts of publicly available data), SBAI uses SSMs or Smaller Science-infused models where we apply the power of AI to private, industry- or company-specific data sets, and add to that applicable scientific laws and any available simulation data. This elegant process presents incredible opportunities for advancements to develop or improve chemicals, materials and formulations while also tackling pressing issues for companies like cost, supply chain and customer satisfaction. And unlike LLM-based solutions, SBAI is an optimized ensemble of models, optimized for each specific use case. Our ability to do this for literally hundreds of use cases in 3 or so person-months each and at a deterministic cost is what allows us to offer customized solutions while being able to scale NobleAI’s business.

    M.R.: What are the challenges to innovation using SBAI?

    Sunil: As is the case with any technological advances, it’s a change in mindset which will be the most immediate challenge. Scientists and researchers are trained to advance or eliminate solutions based on empirical experimentation. This can be cost-prohibitive, and is always by its nature time-consuming and limited in scope. In fact, research into chemicals and specialized materials, an industry that spends $100 billion per year on R&D, has not experienced much innovation in the past 50 years for this very reason. Developing chemicals and materials is incredibly complex, often requiring experimentation across a multitude of parameters so that researchers can understand how hundreds of different ingredients interact at scales ranging from molecular to formulations. But now, AI for science is opening the door to a better approach, and NobleAI is leading the charge. The goal is to use AI to more rapidly explore a greater range of chemicals and materials in software (i.e., before going to the lab), saving potentially months or even years of R&D time.

    M.R.: Where do you see this really taking off first? What are the emerging trends that are most exciting?

    Sunil: To me the most exciting possibilities are in the area of sustainability. There’s a big push to improve the safety of material ingredients for both the environment and human health. For instance, more people, organizations, and regulators are now talking about the need to replace forever chemicals. But anytime there’s a need to replace an ingredient, it can be a real challenge for companies to find substitutes. That’s why you often see the knee-jerk reaction to fight a new environmental regulation. But the great thing about Science-Based AI is that we can turn that around. SBAI can help companies not only stay ahead of the shifting regulatory environment but also get behind sustainability initiatives. I call what we do “Good AI, For Good.”

    M.R. Rangaswami is the Co-Founder of

    Read More

    The Annual SaaS 2024 Report

    By Reports

    Software Equity Group’s annual report is in, revealing that SaaS is here to stay.

    As the report details, for many in the technology industry, the story of 2023 was all about artificial intelligence, its rapidly advancing commercial applications, and the speed and extent with which it will impact the world we live in, both from a business and personal perspective.

    4 SaaS components of 2023 that will impact what we see in 2024.

    1. The advancement of generative AI and its impact on software and SaaS companies, both as users and creators of AI, was a top story in 2023 and one that will be front and center in 2024 as well.

    However, quietly and perhaps a bit behind the scenes, another storyline proved to be just as important in 2023: the resilience of the U.S. economy and subsequent cementing of software and SaaS’s place as a key pillar driving digital transformation globally.

    2. Inflation decreased by nearly half (with the CPI dropping from 6.5% in December 2022 to 3.4% in December 2023), interest rates stabilized, and the labor market remained strong (unemployment rate at 3.7% with 216k jobs added in December).

    3. Software and SaaS companies pivoted towards operational efficiency, and fortunately for the U.S. economy, many of these companies were successful in this endeavor. The result was a fantastic year for the SEG SaaS Index™, with the Index increasing 34% YOY, outpacing the S&P 500 and Dow Jones, and trailing only the Nasdaq (43% increase) among major indices.

    On the M&A side, there were over 2,000 SaaS transactions, making 2023 the second strongest year on record for SaaS M&A, only narrowly trailing 2022.

    4. While AI garnered a lot of the hype in 2023, an equally important story is the strength and resilience of the software ecosystem. 2023 was another proof point that SaaS is “here to stay.”

    4 Macroeconomic Outlooks for 2024: Inflation, interest rates, employment, growth and politics

    1. Inflation continues to decrease, finishing 2023 at 3.4% YOY compared to December 2022. The underlying core CPI, which strips out volatile food and energy prices, measured 3.9% in December 2023, its lowest YOY change since May 2021. Though additional cooling is still needed for inflation to reach the 2% annual target the Federal Reserve sets, the progress made in 2023 is encouraging.

    2. The prospect of the Federal Reserve cutting interest rates is coming into focus. The Fed will closely watch inflation and the unemployment rate (which remains solid at 3.7%) as it plots its course through this year.

    The timing of potential cuts will greatly impact publicly traded SaaS stocks and the M&A markets, as the potential for a lower-cost borrowing environment would be a welcome sight to these markets.

    3. What about a recession? 2023 growth is now expected to come in between 2 and 3%. GDP growth is expected to decline slightly in 2024 but remains positive at around 2%. This scenario avoids a recession altogether and supports a healthy economic environment.

    A scenario in which the U.S. beats GDP estimates again provides an upside case for publicly traded SaaS stocks in 2024. This possibility is further bolstered by the recently released Q4 GDP data, in which the U.S. GDP grew 3.3%, beating consensus estimates.

    4. The economy will be a primary focus on the 2024 campaign trail. However, the reality remains that the Federal Reserve dictates monetary policy independent of political election cycles.

    Election risk is still present due to the divisive nature of the current U.S. political environment, albeit much less discussed than during the last cycle.

    Globally, geopolitical risks include regional conflicts in the Middle East and their impact on oil prices, the ongoing Russia-Ukraine war, and tensions between China and Taiwan.

    To read the details of Software Equity Group’s 2024 SaaS Report, click here.

    Read More

    5 Most Impactful Factors In Valuation of Technology Companies

    By Article

    The turbulent markets of 2022-2023 and volatility in the M&A environment have brought the topic of valuation to the forefront in many of our discussions with founders and investors.

    Regardless of market ups and downs, the factors that are most impactful to valuation remain relatively constant, with some standards changing with market cycles as witnessed over the past decade. Safe to say, valuation continues to be both art and science.

    Allied Advisers put together this article as a refresher on some of the most important valuation factors in the current market for technology companies; we hope our report also serves as broad guidance to founders, executives, and investors in achieving an optimal valuation outcome for their business.

    It is often said that valuing a business is more an art than a science. Another assertion is that valuation is in the eye of the beholder, akin to beauty. There is truth in both these statements, since enterprise valuation is impacted by several variables, not all of which can be quantified, and perception of the future prospects of a business can be quite different depending on the biases of the beholder.

    Regardless of this sense of mystery and fuzziness about valuation, there are several fundamental
    factors that influence the value of a technology business.

    In this article, we cover five important elements that have a distinct bearing on the valuation of technology companies, noting that many of these factors apply to businesses in other sectors as well.

    1) Scarcity in a Large Market
    A business that is the only player, or one of just a few players, in a large end market is likely going
    to be seen as being valuable since there are limited substitutes for the scarce solution offered by
    that company. It is simple supply-demand dynamics – when there is clear demand for a product in
    short supply, the price of that product goes up.

    (Read more)

    2) Significant Differentiation from Competitors
    Often referred to as “USP” or unique selling proposition, differentiation of a technology business is
    important to valuation since it creates scarcity and sets the business apart from its competition.
    Differentiation may come from unique product features, ability to address challenging use cases,
    performance metrics, superior UI design, ease of deployment and use, economic value to the
    customer (time to value, ROI), etc.

    (Read more)

    3) Growth vs. Profit Margin and Rule of 40; Capital Efficient Growth
    In the frothy market prior to COVID that eventually peaked in 2021, hypergrowth was the mantra for technology companies. Businesses that grew at a breakneck pace with no heed to bottom-line profitability attracted nosebleed valuations in private funding rounds. A popular performance measure for software companies, the Rule of 40 (revenue growth rate + profit margin > 40%), was highly biased toward revenue growth; companies that grew at 100% with a -50% operating margin (Rule of 40 metric = 50%) were highly valued for their growth and easily attracted capital, despite their poor profit margins.
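    The Rule of 40 arithmetic described above can be sketched in a few lines of Python. This is an illustrative calculation only, not taken from Allied Advisers' report; the company profiles are hypothetical:

```python
def rule_of_40(revenue_growth_pct: float, operating_margin_pct: float) -> float:
    """Rule of 40 metric: revenue growth rate plus profit margin, in percent.

    A result above 40 is conventionally considered healthy for a software company.
    """
    return revenue_growth_pct + operating_margin_pct

# The hypergrowth profile from the text: 100% growth at a -50% operating margin.
print(rule_of_40(100, -50))  # 50.0 -- clears the Rule of 40 despite heavy losses

# A capital-efficient profile: 25% growth at a 20% margin also clears the bar.
print(rule_of_40(25, 20))    # 45.0
```

    Note that the metric treats a point of growth and a point of margin as interchangeable, which is why the pre-2021 market could reward unprofitable hypergrowth and profitable moderate growth with similar scores.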

    (Read more)

    4) Revenue Model and Gross and Net Revenue Retention Metrics
    Business models typical to technology product/platform companies are subscription, licensing or
    transactional. Subscription models provide recurring revenue (monthly or annually), licensing is
    usually a one-time fee, and the transactional model provides revenue per transaction.
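    As a companion to the retention metrics named in this section's heading, here is a minimal sketch of the standard gross and net revenue retention formulas. The cohort figures are hypothetical and for illustration only:

```python
def gross_revenue_retention(starting_arr: float, churn: float,
                            downgrades: float) -> float:
    """GRR: percent of recurring revenue retained from an existing customer
    cohort, ignoring any expansion. By construction it cannot exceed 100%."""
    return (starting_arr - churn - downgrades) * 100 / starting_arr

def net_revenue_retention(starting_arr: float, churn: float,
                          downgrades: float, expansion: float) -> float:
    """NRR: like GRR, but credits upsell/expansion, so it can exceed 100%."""
    return (starting_arr - churn - downgrades + expansion) * 100 / starting_arr

# Hypothetical cohort: $1.0M starting ARR, $50K churned, $30K in downgrades,
# $120K in expansion revenue over the measurement period.
print(gross_revenue_retention(1_000_000, 50_000, 30_000))         # 92.0
print(net_revenue_retention(1_000_000, 50_000, 30_000, 120_000))  # 104.0
```

    Investors generally read an NRR above 100% as a sign that the installed base grows on its own, even before new-customer sales, which is one reason recurring-revenue models tend to command premium valuations.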

    (Read more)

    5) Customer Profile and Concentration
    Companies that have large enterprises as customers are more likely to be able to expand revenues
    from such clients given the numerous groups within large organizations and bigger budgets for
    vendors. In contrast, having small/medium (SMB) customers limits the opportunities for large
    contracts and wallet share expansion given limited budgets. For these reasons, companies with an
    enterprise customer base have traditionally been viewed more favorably by investors compared to
    businesses serving SMB clients.

    (Read more)

    To read the full report, click here.

    Ravi Bhagavan is a Managing Director at Allied Advisers

    Read More

    M.R. Asks 3 Questions: Ofer Klein, CEO & Co-founder of Reco.AI

    By Article

    Ofer Klein is a decades-long Israeli Defense Force helicopter pilot and avid kitesurfing enthusiast who likens the adrenaline rush to being a founding CEO of a thriving security startup. It’s this unique background and experience that have been key to Ofer’s leadership style and Reco’s success. 

    Ofer and his fellow co-founders originally developed the platform and AI algorithm for counterintelligence work for the Israeli government, then decided to productize it in 2020, which led to the birth of Reco. Today, Reco is a leading organization focused on safeguarding companies with its modern, AI-driven SaaS security offering.

    M.R. Rangaswami: What security concerns are not being talked about enough today?

    Ofer Klein: There are a few. Security Keys Are Replacing Multi-Factor Authentication (MFA) – MFA is a common method of adding a second layer of security onto SaaS applications (in addition to a password). But, MFA is not the only security boundary, as SaaS applications are beginning to use security keys for secondary verification. Security keys are physical devices that use a unique PIN only available on that device to authenticate. 

    Another is Microsoft 365 and Okta cyber attacks. A key concern is maintaining the security of core SaaS applications such as Microsoft 365 and Okta: because they are foundational to making other SaaS programs run, they face more cyber threats and could potentially become the next SolarWinds. Despite growing security threats, these technologies have seen an uptick in adoption. The security built into Microsoft 365 E5 and Okta isn't enough, however, to keep the applications and the organizational data stored in them secure, prompting organizations to look for dedicated SaaS security solutions.

    M.R.: Why is securing SaaS applications so important?

    Ofer: During the pandemic, cloud collaboration tools fundamentally changed the way modern organizations work. Enterprises today are adding new applications to their technology stack at an unprecedented rate, using an average of 371 SaaS applications. This dramatic increase has resulted in an elevated demand for a security solution that provides full visibility into everything connected to a company’s SaaS environment, and at the same time, ensures it complies with regulations. 

    Attempting to secure new SaaS tools with techniques developed for legacy on-premises systems restricts collaboration and misses a broad range of security events. Only by understanding the complete business context of an interaction can security analysts identify and interpret potential threats, and determine the best and most efficient way to respond.

    M.R.: What role does AI play in solving SaaS security?

    Ofer: Like many sectors today, the security industry is being revolutionized by AI. Leveraging AI to identify and address security vulnerabilities is growing rapidly and is very effective. This is especially true for companies adding new generative AI applications to their technology ecosystems, since sharing emails, recorded calls, and other data with those tools can expose an organization to added risk. Incorporating AI models, techniques, and processes such as large language models (LLMs), knowledge representation learning, and natural language processing (NLP) gives companies greater visibility and allows them to discover potentially risky events (such as the improper use of AI tools) and to be alerted to data exposure, misconfigurations, and mispermissions around a user.

    The incredibly fast adoption of generative AI tools has led to new data risks, such as privacy violations, fake AI tools, phishing and more. As a result, organizations need to establish AI safety standards to keep their customer and employee data safe. Having a SaaS security solution that can identify connected generative AI tools is critical. 

    AI is foundational to our SaaS security offering and enables enhanced functionality and effectiveness. Our proprietary, patented AI algorithm powers our Identities Interaction Graph, which correlates every interaction between people, applications, and data, and then assesses potential risk from misconfigurations, over-permissioned users, compromised accounts, risky user behavior, and the use of generative AI applications.

    One-third of organizations regularly use generative AI applications in at least one function, making it critical for SaaS security platforms to be able to discover anomalous behavior by both humans and machines and enable deeper, more proactive threat mitigation.

    M.R. Rangaswami is the Co-Founder of

    Read More

    M.R. Asks 3 Questions: Sanjay Sathé, Founder & CEO, SucceedSmart

    By Article

    Sanjay Sathé, Founder & CEO of SucceedSmart, is no stranger to disrupting established industries. Previously, Sathé spearheaded RiseSmart’s evolution from a concept based on his personal experiences into a major disruptor in the $3B outplacement industry, becoming the fastest-growing outplacement firm in the world. In September 2015, RiseSmart was acquired for $100M by Randstad.

    Launching SucceedSmart, a modern executive recruitment platform with a unique blend of proprietary, patent-pending AI and human expertise, was a culmination of Sathé’s 15 years as a candidate of executive search and 15 years as a buyer of executive search. It was clear that the industry was living in the past and ripe for disruption.

    While many organizations across the broader HR market were embracing technology, the executive search industry continued to operate almost entirely offline and saw a lack of innovation and technology adoption over the past 50 years.

    Sathé invested time in researching both executives and corporate HR leaders to confirm his thinking, and when he received a resounding “yes” to the hypothesis, he dove in to launch SucceedSmart in 2020. SucceedSmart is now on a mission to modernize leadership recruiting for director- to C-level talent and to fill complex leadership roles with unmatched agility, accuracy, and affordability, while promoting diversity and transparency.

    M.R. Rangaswami: How can artificial intelligence (AI) positively impact HR leaders and teams?

    Sanjay Sathé: Businesses across industries have increasingly adopted AI in recent years. It’s no longer a question of whether to embrace AI technology—but when and how.

    Contrary to the misconception that AI will eliminate jobs, AI can empower CHROs, talent partners, talent acquisition teams, hiring teams and other employees to work more strategically, and improve diversity and inclusivity. By automating routine tasks, AI also frees up time for HR professionals to focus on the “human” side of human resources and build relationships with candidates and employees.

    From an HR perspective, AI automates tasks such as talent sourcing, resume screening, and interview scheduling, and helps centralize all candidate information in a streamlined platform. AI technology also unlocks insights about the hiring process and candidate experience to drive improvements over time. Leveraging AI also minimizes conscious and unconscious biases in the hiring process by matching candidates with jobs that align with their accomplishments, skills, and experience.

    M.R.: What are some of the top challenges in executive recruiting today and how can businesses overcome them?

    Sanjay: Leadership has an immeasurable impact on business success and executives are among the most critical employees at any organization. Yet, despite increased turnover, business velocity, and competition, executive search has remained devoid of innovation and technological advancements for half a century.

    The traditional executive search process can take several months—leading to a poor candidate experience, as well as lost productivity and revenue as roles go unfilled. The approach is transactional, exclusionary, clubby, time-consuming, and expensive. Not only is the pricing exorbitant, but in retained search, corporations may have to pay all those fees and still not get a candidate. And the same executives are often passed around between firms, leading to a limited talent pool.

    Embracing modern executive recruitment technology can help address these challenges, decreasing total time to hire and overall hiring costs, and enable organizations to build more diverse leadership teams. It can also support diversity initiatives by focusing specifically on accomplishments and removing demographic and other personally-identifiable information that may lead to unconscious bias during the hiring process.

    M.R.: How can businesses effectively build their leadership pipelines given the Silver Tsunami, meaning the wave of Baby Boomer employees retiring in the coming years? 

    Sanjay: More than 11,000 Baby Boomers reach retirement age each day and more than 4.1 million Americans are expected to retire each year through 2027.

    Traditional executive search primarily focuses on serving organizations—not executives. Firms often wait for executives to reach out to them and the same executives are often passed around between companies, resulting in a limited talent pool. As an increasing number of executives retire as part of the Silver Tsunami, traditional candidate networks are becoming even smaller. 

    To improve talent sourcing across all roles amid the Silver Tsunami, organizations can turn to AI-powered candidate recruitment technology—rather than relying on personal connections. This approach enables organizations to be more proactive about succession planning by identifying and nurturing internal talent while simultaneously scouting for external candidates.

    A modern executive recruitment platform can support the growing and urgent need to fill executive roles as more workers retire, by enabling corporations to build diverse pipelines of qualified executives and reduce total hiring time to a matter of weeks, compared to four to six months with traditional executive search firms. 

    M.R. Rangaswami is the Co-Founder of

    Read More

    PitchBook’s 2024 Industrial Technology Outlook

    By Article

    What does 2024 hold for industrial tech? PitchBook’s latest Emerging Technology Research looks ahead to what could be in store for verticals like agtech, clean energy, and more.

    Here is a summary of Pitchbook’s Outlook on Agtech, the Internet of Things, Supply Chain Tech, Carbon & Emissions Tech, and Clean Energy.

    AGTECH: Autonomous farm robots will see a major increase in adoption.

    The anticipated surge in adoption of autonomous farm robotics in 2024 is driven by a convergence of compelling factors addressing critical challenges within the agriculture sector.

    First, the persistent global labor shortages in agriculture are pushing farmers to seek alternative solutions, with farm automation offering a viable response to mitigate the impact of diminishing workforce availability.

    Second, technological advancements, particularly in artificial intelligence, sensors, and automation, have matured to a point where the cost-effectiveness and reliability of robotic systems make them increasingly attractive for widespread adoption.

    Third, the imperative to optimize resource use, reduce operational costs, and enhance overall farm efficiency aligns seamlessly with the capabilities of modern farm robotics, positioning them as essential tools for a more sustainable and productive agricultural future.

    Fourth, the rise of Robotics-as-a-Service models is proving instrumental in easing upfront costs associated with adopting these technologies.

    Fifth, pilot studies have successfully demonstrated the effectiveness of farm robotics, and companies are now transitioning to full-scale commercialization, making 2024 a pivotal year for the integration of these technologies into mainstream agricultural operations.

    INTERNET OF THINGS: Private 5G startups will produce a unicorn valuation in a late-stage deal or acquisition.

    Unicorn valuations have been rare in the Internet of Things (IoT) industry with only two VC deals for Dragos and EquipmentShare valuing companies over $1.0 billion in North America and Europe in 2023. 5G startups have not reached this threshold despite achieving rapid valuation growth for midstage companies and a $1 billion exit in the space in 2020 for Cradlepoint. Numerous technical and commercial barriers to entry will ease over the coming year and revenue growth is on pace to accelerate.

    The fundraising timelines of private leaders align with this trend, creating investment opportunities for growth-stage and corporate VC investors, along with telecommunications acquirers.

    SUPPLY CHAIN TECH: Drone deliveries will go commercial in the US with more funding and investor interest in the space.

    The Federal Aviation Administration (FAA) regulates the drone delivery market with safety as a primary consideration. To date, drone operators have been subject to a restriction on flying beyond visual line of sight (BVLOS), meaning an operator must keep the drone within sight at all times while it is flying.

    This restriction represents a significant (some might say insurmountable) hurdle for the development of a drone delivery marketplace. The cost of an operator visually tracking and monitoring every delivery via drone is prohibitive.

    The FAA has stated that it wants to integrate drones into common airspace, and issued a number of exemptions to the BVLOS rule to startups and larger companies over the course of 2023.

    These exemptions open the door for the market to finally develop.

    CARBON & EMISSIONS TECH: Demand for carbon credits will recover, following uncertainty in 2022 and 2023.

    Voluntary carbon markets (VCMs) have been under significant scrutiny in recent years, particularly carbon credits based on avoidance—rather than removal—of emissions.

    Multiple competing sets of standards, along with the perceived risk associated with low-integrity credits, have been reducing the overall traded volume of carbon credits and pushing buyers toward removal-based credits, whose integrity is easier to verify.

    New independent standards are emerging, and while there are no obligations for credit providers to follow them, they provide the means to show high integrity and reassure buyers.

    CLEAN ENERGY: US clean hydrogen technology companies will become acquisition targets.

    Low-carbon hydrogen is seen as a key component of global decarbonization efforts, particularly for certain industrial applications and heavy transportation. Earlier this year, the US Department of Energy allocated $7 billion to a program to develop seven hydrogen hubs across the US, to produce, store, and distribute hydrogen.

    Companies involved in these hubs are varied, including energy and oil & gas companies that have experience with large-scale energy projects, but will likely look to close technology gaps through acquisitions.

    To read PitchBook’s full report, click here.

    PitchBook is a Morningstar company providing the most comprehensive, most accurate, and hard-to-find data for professionals doing business in the private markets.

    Read More

    M.R. Asks 3 Questions: Jason Lu, CEO and Founder of CECOCECO

    By Article

    Jason Lu, the founder of CECOCECO, began his journey in the LED display industry in 2006 by creating ROE Visual. His commitment to perfection and a deep understanding of product quality quickly led to ROE Visual becoming a top brand within the industry.

    As an innovator in the field, Jason has consistently been a notable figure in the industry and is never content to rest on past achievements. In 2021 he sought new challenges and founded CECOCECO. With this venture, Jason embraced the idea that LED displays could be more than functional tools; they could integrate technology and aesthetics to create emotionally engaging experiences.

    Jason’s reputation for producing high-quality products is built on years of experience and industry knowledge. His dedication to product development was evident in the launch of ArtMorph by CECOCECO. After two years of dedicated work and maintaining high standards, Jason and his team successfully introduced this innovative product to the market.

    Under Jason’s leadership, CECOCECO is more than a brand; it’s a testament to ongoing innovation in how the world experiences and interacts with light and display technology.

    M.R. Rangaswami: What were the key insights or experiences that led you from ROE Visual to creating CECOCECO, and how do these past experiences shape your current vision?

    Jason Lu: I’ve come to recognize that traditional LED displays, while functional, are not universally applicable to every space and often clash with sophisticated designs. My ambition is to develop products that harmoniously blend functionality with aesthetic appeal. I firmly believe that innovation is fueled by pressure. ROE is currently experiencing stable growth, prompting me to initiate transformative changes.

    Reflecting on my past experiences, I’ve gained a profound understanding of the path to success and the attitude required for it. I’ve learned that success is not an overnight phenomenon. ROE took 17 years to reach its current stature, reinforcing my belief in the ‘slow and steady wins the race’ philosophy. I don’t equate financial gain with success. While survival is crucial, it’s not the epitome of success. My vision for CECOCECO is to relentlessly pursue excellence in our products, continuously innovate, and be a source of inspiration for the industry and the world at large.

    M.R.: How does CECOCECO innovate in the LED lighting and display industry, and what future advancements do you foresee in this space?

    Jason: At CECOCECO, our focus is on pioneering solution-based innovation. While similar products and projects exist, we question their viability and sustainability. Our approach involves crafting systematic solutions with an unwavering commitment to quality in every aspect, from the consistent output of our products to the intricacies of our manufacturing process. This is far more than a mere mechanical production; it necessitates a blend of human creativity and precision control. Our development and manufacturing stages demand extensive manpower, embodying a level of craftsmanship of the highest order. CECOCECO’s mission is to transform previously disjointed elements into cohesive, sustainable systems.

    Looking ahead, we aim to diversify our product range. This includes offering a wider variety of resolutions and shapes and innovating with flexible screen technologies. Our goal is to provide a more comprehensive and diverse range of solutions to meet the evolving needs of our customers.

    M.R.: What emerging trends in LED technology and lighting design do you find most exciting, and how is CECOCECO preparing to integrate these trends into future products?

    Jason: The landscape of LED lighting is undergoing two significant transformations. First, there’s a notable shift from point light sources to surface light sources, with Chip-On-Board (COB) technology gaining increasing popularity. This evolution marks a fundamental change in how we perceive and utilize LED lighting. Secondly, the realm of lighting design is witnessing a surge of creativity. It’s transcending beyond mere color shifts and overlays; dynamic, imaginative light effects are becoming the norm, adding a refreshing dimension to lighting.

    In response to these trends, CECOCECO is exploring integrating COB technology into our products to harness its unique effects. Lighting design isn’t just an aspect of our product; it’s a cornerstone. We’re committed to experimenting with various surface materials and designs to unlock new potential in creative lighting. Furthermore, we’re enthusiastic about collaborating with leading lighting designers. We aim to conceive and develop even more captivating lighting projects by merging our technological prowess with their creative flair.

    M.R. Rangaswami is the Co-Founder of

    Read More

    M.R. Asks 3 Questions: Ran Ronen, Co-Founder & CEO of Equally AI

    By Article

    According to Ran Ronen, 2024 will be the year in which technology leaders innovate by example to help create more inclusive experiences and broaden the base of potential users and customers of their technology services and solutions by prioritizing digital accessibility. 

    Accessible websites and online experiences offer businesses a range of benefits, from compliance with regulatory requirements and industry best practices, to more users and customers being able to access the site, to improved SEO, brand trust, and credibility. Before the advancements made possible by AI, the technical process of making a website accessible was a difficult goal for many website owners to achieve because of the challenge of managing end-to-end accessibility compliance.

    This conversation was an enlightening one, as Ran and I spoke about the positive shift he’s seeing in the tech field toward embracing accessibility guidelines as best practices.

    He is the Co-Founder and CEO of Equally AI, the world’s first no-code web accessibility solution designed to help businesses of all sizes meet regulatory compliance.

    M.R. Rangaswami: What is the state of digital accessibility; and why, in today’s tech-driven world do you think adoption is still lagging to make accessibility a priority in user/customer experience? 

    Ran Ronen: The state of digital accessibility is evolving, yet its integration into mainstream tech remains slower than it should be. Although AI-driven accessibility tools are emerging, many companies still see accessibility as a complex and costly process, often overlooking or delaying it in favor of rapid development. This misses the opportunity to appeal to a wider, more diverse customer base and to enhance product usability for everyone from the outset.

    Slow adoption also stems from limited awareness of diverse user needs and the wider benefits of accessibility beyond legal compliance. There’s a critical need for tech leaders to see accessibility not just as a necessity for individuals with disabilities, but as a key factor in improving overall user experience and innovation, which in turn boosts brand reputation and customer satisfaction.

    M.R.: What are some challenges faced by organizations in managing the technology implementation side of digital accessibility? 

    Ran: Organizations implementing digital accessibility often face several challenges, including a lack of in-house expertise on accessibility standards and implementation, which makes integrating these practices into existing tech frameworks difficult. Resource allocation is another challenge, as accessibility often competes with other business priorities and can be seen as an additional cost. Also, ensuring consistent accessibility across a diverse range of products and platforms presents a scalability challenge, requiring a strategic approach to meet various tech and user needs effectively.

    M.R.: As an innovator in the space, what is your hope for the impact of AI in making more companies and their offerings more digitally inclusive? 

    Ran: As an innovator in the digital accessibility space, my aspiration is that AI will enable a shift in perspective, where digital accessibility becomes not just an aspiration but a practical reality for more companies, especially small and medium-sized businesses. This will help them proactively create accessible products and services, which not only enhances the user experience for all but also opens up new markets and opportunities for innovation. 

    M.R. Rangaswami is the Co-Founder of

    Read More

    M.R. Asks 3 Questions: Ankit Sobti, Co-Founder and CTO of Postman

    By Article

    Ankit Sobti is co-founder and CTO for Postman, the world’s leading API Platform. Prior to joining Postman, Ankit worked for Adobe and Yahoo!, where he served as a senior software engineer. In his current role, Ankit focuses on product and development, leading the core technology group at Postman.

    A key focus for this Q&A are the findings from a recent global survey Ankit and the Postman team published, tracking the most important trends around API use in large enterprises.


    M.R. Rangaswami: APIs are critical tools for enterprise success, but should they also be considered products?

    Ankit Sobti: Thinking about APIs as products helps us understand and articulate that APIs, like anything else you'd typically call a product (a website, a mobile app, a physical product), must be built with a consumer-driven mindset.

    This requires an understanding of who the consumers are, what problems they are trying to solve, why it is a problem in the first place, and what else they are doing to solve it today, and then consciously and deliberately designing a solution to that problem, exposed through the interface of an API.

    And like any other product, APIs also need to be packaged, positioned, priced, distributed, and iteratively improved to meet evolving consumer needs.

    Postman’s 2023 State of the API Report, which surveyed over 40,000 people, found that 60% of API developers and professionals view their APIs as products, which I think is a good signal that this realization is well underway. And it makes sense that APIs are increasingly seen as products, serving both internal and external customers.

    But how does this view vary by industry and company size? And how much revenue can APIs generate? It turns out that the larger the company, the likelier it is to view its APIs as products. At companies with over 5,000 developers, 68% of respondents said they considered their APIs to be products. At the other end of the spectrum were companies with fewer than 10 employees. There, just 49% of respondents viewed their APIs as products. 

    M.R.: Are APIs actual revenue generators now for companies?

    Ankit: Yes, APIs are increasingly unlocking new streams of revenue and business opportunities for companies. In some of the more traditional industries with lower margins for example, we are increasingly seeing APIs being used as a high margin revenue stream. And there are numerous examples now of companies where the primary product being sold is the API.

    APIs can package insights or key capabilities, drive strategic partnerships, or allow companies to become platforms on top of which others can build. We are seeing examples of this ranging from small development shops all the way to large enterprises.

    This is something we also saw in our survey, with 65% of the respondents affirming their APIs generate revenue, and almost 10% of companies with money-making APIs said their APIs generated more than three-fourths of total revenue. 

    M.R.: Does an API-first approach impact revenue?

    Ankit: API-first companies are defined as those that use APIs as the building blocks of their software strategy. APIs not only bind together the internal components of an organization but also pave the way for seamless external collaboration. Thinking in terms of these building blocks, an API-first approach allows for easier externalization of the capabilities that APIs provide and subsequently creates easier paths to revenue.

    In addition, we believe that API-first companies have superpowers that foster happier developers and a healthier business ecosystem. In our customer base, we work with companies across a broad range of industries – and APIs generate significant amounts of revenue, unlock new business opportunities, and drive ecosystem expansion through partnerships.

    And for companies with APIs, it’s worth weighing how much to invest in them, and adopting an API-first approach. These decisions may have a tangible impact on the bottom line. 

    M.R. Rangaswami is the Co-Founder of

    Read More

    Innovate, Engage & Succeed: Embracing the PLG Paradigm – 2H 2023

    By Article

    Allied Advisers has just released its inaugural report on product led growth (PLG).

    Product-Led Growth (PLG) is an innovative customer-centric business strategy that employs user-friendly products to acquire, retain, and expand the customer base, reducing the reliance on traditional sales and marketing.

    As software users, we have had magical experiences with products that allow us to independently explore, test, purchase and expand usage without intervention from the product vendor’s sales team; these PLG strategies have been utilized successfully by leading SaaS companies such as Dropbox, Zoom, Klaviyo and Slack among others. This contrasts with sales led growth (SLG) that relies on direct sales teams to hunt and harvest product sales opportunities.

    This report covers insights on how to develop a PLG strategy from Dharshan Rangegowda, a former Allied Advisers client who grew ScaleGrid via a PLG strategy before raising a growth round with a mid-market PE firm.

    Additionally, the report provides details on transactions of PLG companies as well as profiles of certain PLG businesses in different verticals, indicating significant differences in operational efficiencies when adopting a PLG model.

    To read the full report, click here.

    Read More

    M.R. Asks 3 Questions: Godard Abel, CEO of G2

    By Article

    A 5x SaaS entrepreneur, Godard Abel is CEO of G2, the world’s largest and most trusted software marketplace, which he co-founded in 2012. He is also Executive Chairman of ThreeKit, a leading 3D visualization technology company, and of a next-generation configuration technology company.

    Previously, Godard served as CEO of SteelBrick which was acquired by Salesforce in 2016. Prior to SteelBrick, Godard co-founded BigMachines, where he served as CEO and built it into a leading SaaS provider which was acquired by Oracle in 2013. He also served as a GM at Niku prior to its IPO in 2000 (and subsequent acquisition by CA).

    Before entering the technology industry, Godard consulted for McKinsey & Company and advised leading manufacturers in the U.S. and Germany on strategy development and business process improvement. Godard was a Finalist for EY Entrepreneur of the Year in 2019, named to the Tech 50 list by Crain’s Business Chicago in September 2014, and to the Chicago Entrepreneur Hall of Fame in 2011. He earned an MBA from Stanford University and both a B.S. and M.S. in engineering from the Massachusetts Institute of Technology.

    As you can tell by our conversation, Godard is not only an innovator and leader in the tech world, but he is also very skilled at sharing a lot of information in few words.

    M.R. Rangaswami: How is software buying changing?

    Godard Abel: B2B buyers now expect consumer-like shopping experiences, where they can conduct research and make purchases quickly, conveniently, and on their own terms. This means expensive software solutions can be bought with a credit card, and the buyer conducts research on review sites and other peer communities. In fact, G2 research finds that 67% of global B2B software buyers usually engage a salesperson once they have already made a purchasing decision. 

    M.R.: How does AI impact this shift in software buying behavior? 

    Godard: AI will only accelerate the ongoing shift to self-serve software research and buying, delivering modern digital buyer experiences. The ability of AI to provide immediate, data-driven insights is a key driver of this shift. With this in mind, software vendors have an opportunity to lean into AI to meet buyers’ preferences for speed, eliminating friction in the software buying journey. 

    M.R.: What role does G2 play in this evolving software landscape? 

    Godard: G2 has over 2.4 million verified reviews on 150,000+ products and services. All 1 billion knowledge workers around the world need software and they’re coming to G2 to research it. With our massive dataset on B2B software and the most traffic from software buyers, G2 is uniquely positioned to power software buying and selling in the age of AI. 

    Earlier this year, we introduced Monty, the first-ever AI-powered software business assistant built on OpenAI’s ChatGPT. Previously, a buyer would visit the site and search for the type of software they were looking for – CRM, for example. However, not every buyer knows exactly what they need.

    With Monty, you can now describe the business challenge you’re looking to solve and have a conversation. Powered by G2’s extensive dataset, Monty can recommend the best software solutions for your particular need – making the process of researching software faster, easier, and more effective.

    M.R. Rangaswami is the Co-Founder of

    Read More

    M.R. Asks 3 Questions: Jay Wolcott, Co-Founder & CEO, Knowbl

    By Article

    What does the future of customer experience look like with generative AI?

    According to Knowbl’s CEO and Co-Founder, Jay Wolcott, it’s going to be critical to understand the risks of implementing AI solutions and the requirements behind what “enterprise-ready conversational AI” means.

    In this conversation, Jay sheds light on how this innovative technology redefines customer experience, making interactions more seamless, convenient, and efficient.

    M.R. Rangaswami: What exactly is “BrandGPT,” and how does it differ from traditional conversational AI technologies? 

    Jay Wolcott: BrandGPT is a revolutionary Enterprise Platform for Conversational AI (CAI) built from the ground up on large language models (LLMs). Legacy virtual assistant platforms built upon BiLSTM and RNN frameworks lack the speed, ease, and scalability that LLMs can offer through few-shot learning. 

    Through the release of this all-new approach, CAI can finally meet its potential of creating an effortless self-service experience for consumers with brands. The proprietary AI approach Knowbl has designed within BrandGPT offers truly conversational and contextual interactions while guarding against the uncontrollable risks of generative AI. 

    This new approach is driving tons of enterprise excitement for new levels of containment, deflection, and satisfaction across digital and telephony deployments. Beyond the improved recognition and conversational approach, Knowbl’s platform allows brands to launch quickly, leverage existing content, and improve the scalability of capabilities while reducing the technical effort to manage. 

    M.R.: What emerging trends do you foresee shaping the future of conversational AI and customer experience, and how can businesses prepare for these developments?

    Jay: In 2024 we plan to overcome customer frustration with brand bots and virtual assistants, ushering in a new era of effortless and conversational experiences powered by advanced language models.

    Brands that embrace LLMs for customer automation early on will establish a competitive advantage, while those who lag will struggle to keep up. Although many organizations are still in the experimental phase of using GenAI for internal purposes due to perceived risks, leading brands are boldly venturing into direct customer automation, reimagining digital interfaces with an “always-on” brand assistant.

    We also predict 2024 to be the year that bad bots die. New expectations of AI will lead to frustrated consumers when dealing with legacy bots, and a trend in attrition versus retention will appear.

    M.R.: What complexities do multinational companies face when implementing AI-driven solutions, and how can they navigate the challenges to ensure successful adoption across diverse markets?

    Jay: Multinational companies encounter a myriad of complexities when implementing AI-driven solutions, stemming from the diversity of the markets in which they operate. One significant challenge lies in reconciling varied regulatory landscapes and compliance requirements across different countries, necessitating a nuanced approach to AI implementation that adheres to local regulations. 

    Additionally, cultural and linguistic diversity poses a hurdle, as AI solutions must be tailored to resonate with the unique preferences and expectations of diverse consumer bases. To successfully navigate these challenges, companies must prioritize a robust localization strategy, customizing AI solutions to align with each market’s specific needs and cultural nuances. 

    Collaborating with local experts, remaining vigilant of regulatory changes, and fostering open communication with stakeholders are essential for multinational companies to achieve successful AI adoption across diverse markets.

    M.R. Rangaswami is the Co-Founder of

    Read More

    M.R. Asks 3 Questions: John Hayes, Founder & CEO, Ghost Autonomy

    By Article

    John Hayes is CEO and founder of autonomous vehicle software innovator Ghost Autonomy.

    Prior to Ghost, John founded Pure Storage, taking the company public (PSTG, $11 billion market cap) in 2015. As Pure’s chief architect, he harnessed the consumer industry’s transition to flash storage (including the iPhone and MacBook Air) to reimagine the enterprise data center, inventing blazing-fast flash storage solutions now run by the world’s largest cloud and ecommerce providers, financial and healthcare institutions, science and research organizations, and governments.

    Like Pure, Ghost uses software to achieve near-perfect reliability and re-defines simplicity and efficiency with commodity consumer hardware. Ghost is headquartered in Mountain View with additional offices in Detroit, Dallas and Sydney. Investors including Mike Speiser at Sutter Hill Ventures, Keith Rabois at Founders Fund and Vinod Khosla at Khosla Ventures have invested $200 million in the company.

    Now, let’s get into it, shall we?

    M.R. Rangaswami: How does the expansion of LLMs to new multi-modal capabilities extend their application to new use cases?

    John Hayes: Multi-modal large language models (MLLMs) can process, understand, and draw conclusions from diverse inputs like video, images, and sounds, expanding beyond simple text inputs and opening up an entirely new set of use cases in everything from medicine to legal to retail applications. Training GPT models on more application-specific data will help improve them for their specific tasks. Fine-tuning will increase the quality of results, reduce the chances of hallucinations, and produce usable, well-structured outputs.

    Specifically in the autonomous vehicle space, MLLMs have the potential to reason about driving scenes holistically, combining perception and planning to generate deeper scene understanding and turn it into safe maneuver suggestions. The models offer a new way to add reasoning to navigate complex scenes or those never seen before.

    For example, construction zones have unusual components that can be difficult for simpler AI models to navigate — temporary lanes, people holding signs that change, and complex negotiation with other road users. LLMs have been shown to be able to process all of these variables in concert, with human-like levels of reasoning.

    M.R.: How is this new expansion impacting autonomous driving, and what does it mean for the “autonomy stack” developed over the past 20 years?

    John: I believe MLLMs present the opportunity to rethink the autonomy stack holistically. Today’s self-driving technologies have a fragility problem, struggling with the long tail of rare and unusual events. These systems are built “bottom-up,” composed of a combination of point AI networks and hand-written driving software logic to perform the various tasks of perception, sensor fusion, drive planning and drive execution – all atop a complicated stack of sensors, maps and compute.

    This approach has led to an intractable “long tail” problem – where every unique situation discovered on the road requires a new special purpose model and software integration, which only makes the total system more complex and fragile. With the current autonomous systems, when the scene becomes overly complex to the point that the in-car AI can no longer safely drive, the car must “fall-back” – either to remote drivers in a call center or by alerting the in-car driver. 

    MLLMs present the opportunity to solve these issues with a “top-down” approach by using a model that is broadly trained on the world’s knowledge and then optimized to execute the driving task. This adds complex reasoning without adding software complexity – one large model simply adds the right driving logic to the existing system for thousands (or millions) of edge cases.

    There are challenges implementing this type of system today, as the current MLLMs are too large to run on embedded in-car processors. One solution is a hybrid architecture, where the large-scale MLLMs running in the cloud collaborate with specially trained models running in-car, splitting the autonomy task and the long-term versus short-term planning between car and cloud.

    M.R.: What’s the biggest hurdle to overcome in bringing these new, powerful forms of AI into our everyday lives?

    John: For many use cases, the current performance of these models is already there for broad commercialization. However, some of the most important use cases for AI – from medicine to legal work to autonomous driving – have an extremely high bar for commercial acceptance. In short, your calendar can be wrong, but your driver or doctor cannot. 

    We need significant improvements on reliability and performance (especially speed) to realize the full potential of this technology. This is exactly why there is a market for application-specific companies doing research and development on these general models. Making them work quickly and reliably for specific applications takes a lot of domain-specific training data and expertise. 

    Fine-tuning models for specific applications has already proven to work well in the text-based LLMs, and I expect this exact same thing will happen with MLLMs. I think companies like Ghost, who have lots of training data and a deep understanding of the application, will dramatically improve upon the existing general models. The general models themselves will also improve over time. 

    What is most exciting about this field is the trajectory — the amount of investment and rate of improvement is astonishing — we are going to see some incredible advances in the coming months.

    M.R. Rangaswami is the Co-Founder of


    Read More

    M.R. Asks 3 Questions: Gerry Fan, CEO of XConn Technologies

    By Article

    Gerry Fan serves as the Chief Executive Officer at XConn Technologies, a company at the forefront of innovation in next-generation interconnect technology tailored for high-performance computing and AI applications.

    Established in 2020 by a team of seasoned experts in memory and processing, XConn is dedicated to making Compute Express Link™ (CXL™), an industry-endorsed Cache-Coherent Interconnect for Processors, Memory Expansion, and Accelerators, accessible to a broader market.

    In pursuit of expediting the adoption of CXL, Gerry and his teams have successfully introduced the world’s inaugural hybrid CXL and PCIe switch – with a strategic approach that will make computers faster, smarter, and better for the environment.

    M.R. Rangaswami: What barriers are being faced by AI and HPC applications that you are looking to address?

    Gerry Fan: Next generation applications for artificial intelligence (AI) and high-performance computing
    (HPC) continue to face memory limitations. The exponential demand these applications place on memory bandwidth has become a barrier to their further innovation and widespread adoption.

    The CXL specification has been developed to alleviate this challenge by offering unprecedented memory capacity and bandwidth so that critical applications, such as research for drug discovery, climate modeling or natural language processing, can be delivered without memory constraints. By applying CXL technology to break through the memory bottleneck, XConn is helping to advance next-generation applications where a universal interface can allow CPUs, GPUs, DPUs, FPGAs and other accelerators to share memory seamlessly.

    M.R.: How are you looking to solve the challenge with the industry’s first and only hybrid CXL
    and PCIe switch?

    Gerry: While CXL technology is poised to alleviate memory barriers in AI and HPC, a hybrid approach that combines CXL and PCIe on a single switch provides a more seamless pathway to CXL adoption. PCIe (Peripheral Component Interconnect Express) is a widely used interface for connecting hardware components, including GPUs and storage devices. Many traditional applications need only the interconnect capability offered by PCIe. Increasingly, though, next-generation applications need the higher bandwidth enabled by CXL. System designers can be stuck deciding which approach their systems will need most.

    XConn is meeting this challenge by offering the industry’s first and only hybrid CXL 2.0 and PCIe Gen 5 switch. Combining both interconnect technologies on a single 256-lane SoC, the XConn switch is able to offer the industry’s lowest port-to-port latency and lowest power consumption per port in a single chip – all at a low total cost of ownership. What’s more, system designers only have to design once to achieve versatile expansion, heterogeneous integration for a mix of accelerators, and fault tolerance with the redundancy mission critical applications require for true processing availability.

    M.R.: In your view, how will XConn revolutionize the future of high-performance computing and AI?

    Gerry: Together with other leading CXL ecosystem players, XConn is delivering on CXL’s promise to support faster, more agile AI processing. This will deliver the performance gains AI and HPC applications need to accelerate research and innovation breakthroughs. It will also support greater energy efficiency and sustainability while helping to proliferate the “AI Everywhere” paradigm for smarter and more autonomous systems.

    By helping to foster innovation and accelerate application use cases, XConn is delivering the missing link that will pave the way for unprecedented computing performance needed for tomorrow’s breakthroughs and technology advancements.

    M.R. Rangaswami is the Co-Founder of

    Read More

    M.R. Asks 3 Questions: Razat Gaurav, CEO of Planview

    By Article

    When I sat with Razat, he was clear on the imperative of digitalisation in almost every organisation in every industry today, which is what is leading to more than $3trn of annual spending on it.

    His rationale behind digitalisation is sound, but as he shared, studies show that much of that work is wasted, more than 40% in some cases. This is largely due to the disconnect between strategy and what’s being executed by teams across the business.

    As the leader in portfolio management and value stream management, Razat Gaurav shares in this conversation why bridging the strategy-execution gap is essential for organizational and leadership transformation.

    Would you believe that 40% of strategy work gets wasted in execution?

    M.R. Rangaswami: What is the biggest challenge orgs face when connecting strategy to execution?

    Razat Gaurav: The biggest challenge between strategy and execution is change: change from technology shifts, demographic shifts, and even generational shifts. It’s not a new phenomenon. But what has changed is that the pace of change is exponentially faster. Companies must be able to quickly analyse and adapt or evolve their strategy, and how those changes are executed, while still driving important business outcomes.

    M.R.: The research arm of The Economist found that 86% of executives think their organizations need to improve accountability for strategy implementation. What challenges do orgs face around measurement?

    Razat: The key thing that gets in the way is data silos. Most organisations are swimming in data, yet most of that data is not usable for making decisions. Curating the relevant data to align with your priorities and objectives is critical to achieving accountability for strategy implementation.

    What we find is that many organisations have three major gaps when they look at how they measure understanding of strategic goals.

    First, organisations are measuring inputs or outputs, but they’re not measuring outcomes. Particularly when dealing with digital transformations, the business and technology teams must work together to focus on the outcome.

    The second gap is around creating a synchronised, connected approach to objectives and key results, what some organisations call OKRs. Is leadership in alignment with the way an individual contributor gets measured? And does the individual contributor understand how they impact their leadership’s OKRs? That bidirectional synchronisation is key.

    And then the last piece is how the different functions in the organisation (finance, manufacturing, sales, and so on) align their OKRs to help achieve the company’s objectives and key results.

    M.R.: What should leaders do first to narrow the strategy-execution gap?

    Razat: My first piece of advice would be, take a deep breath because change is constant.

    As organisations, as leaders, as individuals, we all have to be ready to adapt and change. But beyond taking that deep breath, there are three things I’d advise organisations to do.

    First, figure out the three initiatives that will actually move the needle. Second, define OKRs and an incentive structure for the outcomes you’re trying to achieve. Third, invest in systems that allow you to break out of those data silos to execute as one organisation, as one team.

    M.R. Rangaswami is the Co-Founder of

    Read More

    SMB SaaS: The Younger and Sometimes Overlooked Sibling of Enterprise SaaS

    By Reports

    According to the recent update from Allied Advisers, SMB is the backbone of the US economy; 99.9% of all US businesses are in this segment. With rising SaaS adoption by small businesses for enhancing productivity, we remain optimistic on the long-term view of this sector.

    While, not surprisingly, SMB SaaS has higher churn than Enterprise SaaS, it has significantly better operational metrics when it comes to sales and marketing expense, R&D expense, and EBITDA margins, along with less sector competition. Our report covers the nuances of SMB SaaS, and we believe that SMB SaaS businesses continue to offer compelling opportunities for investors and buyers.

    This particular Allied Advisers report updates their SMB SaaS analysis, highlighting a sector that has been growing with notable outcomes.

    The report pulls from IPOs of Freshworks ($1.03B), Digital Ocean ($780M), and Klaviyo ($576M), notable exits such as Mailchimp’s acquisition ($12B+, one of the largest bootstrapped exits), and the growth of private SMB SaaS companies like Calendly (last valued at $3B) and Notion (last valued at $10B).

    To see the full summary of Allied Advisers’ update, click here.

    Gaurav Bhasin is the Managing Director of Allied Advisers.

    Read More

    State of SaaS M&A: 4 Buyers’ Perspectives

    By Article

    One year ago, Software Equity Group opened their 2022 report on M&A trends with a simple observation: stock market activity was not for the faint of heart. That view led to a much broader inquiry throughout the report into the myriad dynamics at play and their impact on the software M&A market.

    So how are Founders and CEOs exercising caution when considering M&A and liquidity events in the face of ongoing economic uncertainty, and is their restraint warranted?

    To cut to the chase: it depends. For software businesses with the right profile (more on that later), there is tremendous opportunity in the current M&A landscape.

    To better assess the state of the market, SEG analyzed data from our annual survey of CEOs, private equity investors, and strategic buyers, in addition to our quarterly report and our transactions.


    1. Cautious CEOs Are Holding Off On Going To Market

      Not surprisingly, the macroeconomic environment has colored CEOs’ perceptions of the SaaS M&A market. Seventy-eight percent believe valuations are the same as or lower than last year’s, and over two-thirds believe the market will improve in the coming years.

      As a result, many are waiting to explore and see what the future holds before going to market.

      2. Buyers And Investors Face Shortage Of Opportunities

      In contrast to the CEOs’ viewpoint, buyers and investors are finding that the competition is holding steady or getting stronger. They are eager to do deals with high-quality businesses, but there are not as many opportunities available as in 2022.

      Meanwhile, 66.7% of strategics say they have seen no change or a decrease in the volume of high-quality SaaS companies in the market over the past year. This supports the idea that high-quality M&A opportunities are scarce in 2023 and high-quality businesses that pursue a liquidity event receive outsized interest from buyers and investors.

      3. Growth, Retention & Profitability Are Key

      Given the uncertainty in the macro markets over the last 18 months, it is not surprising that buyers have become more risk-averse, and the profile of a highly desirable asset has shifted.

      While revenue growth and retention are weighted strongly, there is little interest in businesses burning significant cash. In 2020 and 2021, the high-burn, growth-at-all-costs model was considered attractive. In 2023, the story has changed.

      4. High-Quality Assets Are Demanding Premium Valuations

      The current market represents a classic supply and demand dynamic. When the supply of a good decreases, and the demand for said good stays the same or increases, its price is expected to increase.

      Where is the data that supports it?

      The answer is hard to find in the public markets. The share prices of public SaaS companies in the SEG SaaS Index have rebounded this year but are still down roughly 36% from COVID-level peaks.

      The Nasdaq has sharply rebounded from 2022 lows, due to the “Magnificent 7” companies and excitement over artificial intelligence. Most notably, valuations in M&A deals have decreased by 36%
      since 2021.

      There Is Good News For SaaS Companies.

      It is easy to understand why CEOs are cautious right now, and many are right to be. The landscape has shifted from where it was a few years ago, with buyer and investor priorities shifting as well. It is clear, however, that the deficit of profitably growing assets on the market is working in favor of sellers.

      This is due to increasing competition for highly sought-after software companies that display strong revenue growth and retention. One thing everyone agrees on: higher valuations lie ahead.

      To read the full SEG review on SaaS M&A: 4 Buyers’ Perspectives, click here.

      Read More

      Status Check: 5 Early Predictions for 2023

      By Article

      In January 2023, Leigh Segall, Chief Strategy Officer at Smart Communications, a leading technology company focused on helping businesses engage in more meaningful customer conversations, shared her predictions on what businesses would be focusing on in customer experience in 2023.

      We’ve kept these in our back pocket, knowing that as we round out Q4 it would be useful to reflect on where customer experience strategies currently stand in this climate.

      1. Ever-changing customer behaviors will require enterprises to reimagine existing business models

      The accelerated shift to digital that was originally driven by the global pandemic has consumers expecting total digital freedom, with the ability to choose when, where and how they interact with brands across many industries.

      Even those who were slow to adopt digital are now on board — which means businesses must adapt, not just to meet today’s expectations but also to prepare for the changes tomorrow may bring. Analysts and experts agree that businesses must focus on customer-centricity — particularly industries that have lagged in moving to digital. And they can show that they care by focusing less on one-way transactions and more on two-way customer conversations that drive trust and loyalty, and provide value. 

      2. Conversational experiences will make or break brand loyalty and customer trust

      Consumers and businesses alike are overwhelmed with choice, making competition for attention and loyalty fiercer than ever. Add ongoing instability to the equation, and cultivating trust becomes the key to fostering lasting customer relationships.

      Earning customer trust is especially challenging for industries that deal with emotionally-charged matters — such as money, health, and property loss or damage. Businesses addressing these needs should cultivate a tech ecosystem that’s interconnected and interoperable, pulling together data and processes from multiple systems of record to create easy, efficient conversations that are both sophisticated and seamless. 

      3. Enterprises will automate and digitize key business processes to increase operational efficiency

      The pandemic-accelerated pace of digital transformation has led to an IT skills shortage that’s being felt globally. And many businesses are looking to low-code solutions to reduce the burden on IT and increase operational efficiency by empowering non-technical business users.

      Shifting the mindset away from maintenance paves the path for future success by freeing IT teams from routine and repetitive tasks, allowing them to focus on more strategic initiatives. Cloud-based solutions also reduce total cost of ownership (TCO) and technical debt while bringing much needed resilience. Cultivating a tech ecosystem that brings agility and flexibility at scale will be critical to increasing operational efficiency without impacting customer experience. 

      4. Enterprises will mitigate risks and protect brand reputation by increasing the focus on compliance and regulatory requirements

      Continuing cyberthreats are creating an increased need for business leaders to focus on compliance and regulatory requirements, which are constantly evolving — particularly for highly-regulated industries such as financial services, healthcare and insurance.

      Adopting a cloud-first approach will enable highly-regulated organizations to greatly reduce risks and keep up with ever-changing regulatory requirements — which will continue to evolve in 2023 and beyond. Investing in the right tech partners enables deep visibility into the nuanced requirements of each industry, with the ability to easily make sweeping updates as the rules of engagement change. Layering on automated, digitized solutions helps to ensure communications are compliant across all customer touchpoints; legacy systems simply aren’t up to the task. 

      5. Technological innovation will remain a top priority as enterprises recognize the increased need for agility and scalability

      Business leaders know that speed and scale are mission critical. As global markets become more interconnected and waves of change continue to rise, enterprises must be able to adapt on the fly — and at massive scale. This calls for replacing legacy systems and processes with sophisticated, cloud-first solutions that enable data interconnectivity, operational efficiency and enterprise-wide flexibility.

      As customer expectations continue to evolve, businesses need to be able to access and act on customer data and deliver personalized, unique customer interactions at every touchpoint. 

      We’d love to hear your thoughts — so please send us an email!

      Read More

      Ashu Garg: 3 Takeaways from the Generative AI “Unconference”

      By Article

      As General Partner at Foundation Capital, Ashu Garg collaborates with startups throughout the enterprise stack. His career reflects his enthusiasm for machine learning and for revolutionizing established software domains to create fresh consumer interactions.

      While FC’s inaugural Generative AI “Unconference” was held back in June, we still find ourselves referencing Ashu’s observations from the conference. We hope you take away as much from his highlights as we have.

      1. AI natives have key advantages over AI incumbents

      In AI, as in other technology waves, every aspiring founder (and investor!) wants to know: Will incumbents acquire innovation before startups can acquire distribution? Incumbents benefit from scale, distribution, and data; startups can counter with business model innovation, agility, and speed—which, with today’s supersonic pace of product evolution, may prove more strategic than ever.

      To win, startups will have to lean into their strength of quickly experimenting and shipping. Other strategies for startups include focusing on a specific vertical, building network effects, and bootstrapping data moats, which can deepen over time through product usage.

      2. In AI, the old rules of building software applications still apply

      How can builders add value around foundation models? Does the value lie in domain-specific data and customizations? Does it accrue through the product experience and serving logic built around the model? Are there other insertion points that founders should consider?

      While foundation models will likely commoditize in the future, for now, model choice matters. From there, an AI product’s value depends on the architecture that developers build around that model. This includes technical decisions like prompts (including how their outputs are chained to both each other and external systems and tools), embeddings and their storage and retrieval mechanisms, context window management, and intuitive UX design that guides users in their product journeys.

      3. Small is the new big

      Bigger models and more data have long been the go-to ingredients for advancements in AI. Yet, as our second keynote speaker, Sean Lie, Founder and Chief Hardware Architect at Cerebras, relayed, we’re nearing a point of diminishing returns for simply supersizing models. Beyond a certain threshold, more parameters do not necessarily equate to better performance. Giant models waste valuable computational resources, causing costs for training and use to skyrocket.

      To read Ashu’s full report, and his Top 5 Takeaways, click here.

      Read More

      M.R. Asks 3 Questions: Colin Campbell, Author

      By Article

      Roughly 20% of new businesses fail within the first year, and 50% are gone within five years.

      So what makes a startup successful? Is it mainly a combination of hard work and luck, or is there a winning formula?

      Colin C. Campbell has been a serial entrepreneur for over 30 years. He has founded and scaled various internet companies that collectively have reached a valuation of almost $1 billion. In his new book, Start. Scale. Exit. Repeat.: Serial Entrepreneurs’ Secrets Revealed!, Colin shares a wealth of experience in an in-depth guide featuring interviews with industry experts, pointing readers in the right direction on their entrepreneurial journey and helping answer the questions they’ll encounter.

      M.R. Rangaswami: What is it about what you share in Start. Scale. Exit. Repeat.: Serial Entrepreneurs’ Secrets Revealed! that you feel hasn’t been shared before?

      Colin Campbell: Start. Scale. Exit. Repeat. represents 30 years of my experience as a serial entrepreneur, a decade of research and writing, and over 200 interviews with experts, authors, and fellow serial entrepreneurs. The book deconstructs the stages of building a company from inception to exit, and lays out strategies to replicate this success repeatedly.

      At each stage of a company’s life cycle, it’s crucial to fine-tune your narrative, assemble the right team, secure adequate funding, and put in place effective systems. The strategies for achieving these vary dramatically, from the chaotic, founder-centric startup phase to the more structured approach needed to scale. As you near the finish line, your strategy will have to pivot once again.

      The core message of Start. Scale. Exit. Repeat. is that entrepreneurship isn’t a “one and done” affair. It’s a skill—akin to any other trade—that you can master and continually refine. There’s a recipe for launching a successful startup, and this book simplifies it into actionable steps to be taken one at a time.

      Furthermore, the book challenges the prevailing obsession with unicorns. We exist in a “unicorn culture,” where a valuation under a billion dollars is often frowned upon. But this mindset is perilous. The high-velocity chase for unicorn status has led to a wreckage of dreams and fortunes along the Silicon Valley highway. I’ve witnessed countless founders succumb to this “Silicon Valley disease,” sacrificing years of labor and significant capital.

      There’s a more pragmatic approach to building wealth, and it’s far simpler: start, scale, exit, take some money off the table, and repeat.

      M.R.: What was your biggest lesson from one of your biggest setbacks?

      Colin: Let’s take a trip down memory lane to the early ’90s. My brother and I launched an Internet Service Provider (ISP) in Canada. We were pioneers on the “Information Superhighway,” connecting hundreds of thousands of Canadians to the internet. We found ourselves in the whirlwind Geoffrey Moore famously described as the “Tornado.” It was an exhilarating ride, especially for a couple of 20-somethings who had grown up on a farm.

      We took the company public later in the ’90s and merged it with a wireless cable company, closing at a valuation of approximately $180 million. After receiving 50% of a wireless spectrum for fixed wireless internet from the Canadian government—yes, they handed out spectrum back then to encourage competition—our company’s valuation skyrocketed to over $1 billion. Technically, it was a stock-for-stock swap, with our shares being locked up for 18 months. At 28 years old in 1998, I owned almost 14% of the company.

      We thought we were invincible. The internet was poised to change everything, and we were on the forefront. 

      Then, out of nowhere, the .COM crash hit. 

      Our company pulled its secondary offering to raise $50 million because the Nasdaq had tanked to 4,000. And it kept falling, plummeting to 1,300 and not recovering for over a decade. It was indeed the .COM crash, and the music had stopped—without enough chairs to go around.

      Did we make mistakes? Absolutely. We shouldn’t have relinquished control without securing liquidity. “Liquidity or control” has since become our mantra for all future ventures. And let’s face it—stuff happens. Technologies evolve, regulations change, and market climates shift. That’s why it’s crucial to exit when times are good. When the party’s in full swing, make a discreet exit, take some money off the table, and focus on your next venture.

      As for that unicorn of ours? It filed for bankruptcy protection, and our stock plummeted from a high of $19 a share to the paltry sum I sold it for: 6 cents a share.

      Thankfully, we regrouped and stuck to our strengths. We launched Hostopia, a global leader in hosting and email solutions for telecoms. We took it public and eventually sold it to a Fortune 500 company—this time for an all-cash deal—just a month before the Lehman crisis in 2008.

      M.R.: In your experience, once a business survives its first five years, what’s the next riskiest precipice it encounters?

      Colin: The vast majority of companies in America are small businesses, and most struggle to scale. But make no mistake—there’s a formula for scaling your enterprise. Some companies might find it more challenging than others, and some may opt out due to the stress and transformative changes that come with scaling.

      In the SaaS (Software as a Service) industry: if you’re not growing, you’re dying. After the .COM crash, we found ourselves running low on funds while operating our hosting and email platform. Still, we remained optimistic. Why? Because even though we were bleeding $500,000 per month, our customer base was growing. Growth is the lifeline in SaaS; losing money is acceptable as long as you’re expanding.

      Hostopia, for example, adhered to the Rule of 40, maintaining a growth rate plus profit margin that exceeded 40%. We achieved 32 consecutive quarters of growth, leading to an IPO and ultimately a successful sale at a 60% premium over our trading price to a Fortune 500 company. Another venture, .CLUB Domains, also operated in the red for several years. Nevertheless, we managed to cut losses by about half a million dollars annually until we started adding the same amount to our bottom line, culminating in an exit to GoDaddy Registry.
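      The Rule of 40 mentioned above is simple arithmetic: year-over-year revenue growth rate plus profit margin should meet or exceed 40%. A minimal sketch, with hypothetical figures:

```python
def rule_of_40(revenue_growth_pct: float, profit_margin_pct: float) -> bool:
    """True if growth rate plus profit margin meets the Rule of 40."""
    return revenue_growth_pct + profit_margin_pct >= 40.0

# Hypothetical examples:
print(rule_of_40(35.0, 10.0))  # 45 >= 40 -> True
print(rule_of_40(15.0, 5.0))   # 20 <  40 -> False
```

      Note that a fast-growing company can pass the rule while unprofitable (e.g. 60% growth at a -15% margin scores 45), which matches the point above that losing money is acceptable as long as you're expanding.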

      Am I a genius entrepreneur? As much as I’d like to think so, that’s far from the truth. In 2005, our company was facing internal strife, stalled sales, and a board questioning my role as CEO. One board member even remarked, “He’s too young and way in over his head.” That’s when a friend introduced me to Patrick Thean, a coach at Rhythm Systems. Patrick taught us invaluable systems like goal setting, strategic planning, daily huddles, and KPI tracking. In addition, we partnered with other coaches to transform the organization from a tech-centric company to a sales-driven organization. The ultimate effect of all of these changes: we tripled our size within a few years.

      Since then, we incorporated these systems along with countless other insights I’ve gathered from serial entrepreneurs, experts, and authors. We’ve encapsulated these stories and lessons in the book, laying out a clear roadmap for SaaS companies aiming to scale.

      M.R. Rangaswami is the Co-Founder of

      Read More

      A Quick Q&A with David Luke, Global Practice Leader at Consulting Solutions

      By Article

      Organizational Optimization is what gives David Luke’s career credibility.

      In this quick Q&A, David shares his insights on the major staffing and retention challenges tech leaders are facing and how IT teams can accelerate their approaches to innovation to stay competitive.

      M.R. Rangaswami: What kind of staffing and retention challenges are IT leaders facing right now?

      David Luke: IT leaders are experiencing a new phenomenon in today’s professionals: an influx of talent that is demanding to work in non-traditional ways. HR departments are finding it difficult to create a standard job class or role category. Executives and line managers alike are turning to firms like Consulting Solutions for a la carte solutions to address anti-patterns that are impeding their business.

      Here are what I believe to be the top five challenges in our current labor market:

      1. Creating a safe space for employees where they can land, grow, and learn while delivering both innovative and traditional pieces of work. By partnering with HR and recruiting firms, leaders can develop a place where folks want to work, are able to grow their career to the level that they desire, and develop their knowledge / skills with a defined path forward.
      2. Attracting people who are late career that bring knowledge and maturity to an organization. These are the gems in our workforce that can not only deliver with speed but also mentor new professionals in the workforce.
      3. The ability to balance a lower-cost delivery with a world-class product and retaining those people that deliver that product.
      4. The decision between remote and on-site, which means ensuring that you are getting the talent that will accelerate your business by offering options for your people. There is some exceptional talent out there who would love to work remotely, and then there are also folks who thrive in an in person collaborative environment. Leaders need to weigh how they want their workforce to be shaped and potentially develop a blend.
      5. Although it’s an attractive practice, leaders need to understand some of the limitations of nearshoring/offshoring their workforce—fewer overlapped hours, decreased team retention due to offshore labor practices, and collaboration on a limited basis. Ensure that you weigh the cost savings versus delivering an exceptional product.

      M.R.: Why is the “product owner role” so critical to delivery team success? 
      David Luke: Exceptional product owners use their superpowers to bring the product vision down to the team level. They focus relentlessly on prioritizing what is needed and what is wanted for their business, their stakeholders, and their customers. The best product owners can strike the right balance between being specific enough to provide clear direction to the team while still being flexible enough to accommodate changes and shifts in priorities that come from a deep and dynamic partnership with product managers.
      These proverbial unicorns also have a deep knowledge of user needs and the experience that the business wants the customer to receive. They easily see the bigger picture and engage often with product managers, customer experience, and user-experience experts to define and drive the delivery of great products. 
      Elite product owners have an abundance of empathy in their toolkits. They’re able to read the pulse of the team, the customer, and stakeholders while balancing the push and pull to deliver great products.
      What sets apart the truly outstanding product owners is the ability to effectively listen. Not just to the words but to the underlying messages and sentiments of everyone who they actively seek to communicate with as part of their rituals, ceremonies, and workdays. 
      Great product owners don’t just look inward; they excel at looking outward to the market, the competition, and the changing technologies that they work with every day. They know the goals and challenges and can articulate the path forward to lead their teams and their products to successful outcomes. They are storytellers, evangelists, and cheerleaders for their teams and their products. The word on the chest of their superhero suit is often “TEAM”. 

      M.R.: If technology is evolving faster than workplace structures can keep up, what must IT teams do to accelerate their approach to stay competitive and deliver results? 
      David: At the heart of any change to approach, regardless of its scope, lies the critical support of leadership. While grassroots efforts can certainly achieve success, a unified message and commitment from the top sets the tone for the entire organization.

      To ensure an accelerated approach, it is also essential to establish governance and a defined way of working, while remaining open to adjusting these as you gain a deeper understanding of your company’s culture. With these foundational elements in place, you can then develop charters and set clear, measurable objectives and key results (OKRs) to guide your progress toward success. And most importantly, START THE WORK! Don’t get bogged down in planning—act and stay focused on delivering results. 

      Once you have established a new, accelerated way of working, you must set about to streamline your efforts and prioritize the things that are most important to your customers. Use your product owners, UX experts, and CX experts to gain the trust and pulse of your customers, as they are who you are building for, and they will tell you if you are getting it right. Leverage new practices such as design thinking to understand who you are building for, what their pains are, and how you can deliver products to eliminate or alleviate those pains.

      M.R. Rangaswami is the Co-Founder of

      Read More

      Navigating M&A in Uncertain Markets: 2H 2023 Update

      By Reports

      The question of “are the market conditions right” remains in the minds of investors and executives interested in exploring M&A. We address this question by sharing our perspectives on how to achieve a successful M&A outcome.

      Our recommendations are based on Allied Advisers’ deep experience in advising clients on their exit to both Fortune-500 and mid-market strategic buyers, as well as a diversity of PE funds. In the last 12 months, we advised clients on their exit to: Activision Blizzard King, Walmart, Dura Software, PSG Equity and Virtana among others.

      While 2010-2021 were robust years for M&A and capital raises for technology companies, today's markets have changed significantly in terms of deal volume and valuation, though we are seeing improvement toward a more rational and sustainable market.

      With the major indices rebounding this year from the lows of 2022, the question of whether market conditions are right remains on the minds of investors and executives interested in exploring M&A.

      This article covers several M&A trends: private equity (PE) continues to be a major driver of deal volume, larger private companies have emerged as new technology M&A buyers, and deal volume and value are stabilizing.

      Also, the impending IPOs of Arm (semiconductor), Klaviyo (software) and Instacart (internet) not only provide a litmus test of what private companies are worth in public markets but also create currency, potentially opening the door for them and a slew of other companies to future IPOs and M&A.

      In the last 12 months, Allied Advisers advised clients on their exits to Activision Blizzard King, the world’s largest game network and a Fortune 500 company; Walmart, a Fortune One company; Dura Software, a software consolidator; PSG Equity, a top-tier PE fund ($22.1B AUM); and Virtana, a growing PE-backed company, among others.

      Below is the full report from Allied Advisers:

      Gaurav Bhasin is the Managing Director at Allied Advisers.

      Read More

      A Quick Q&A with Jonathan Tomek, Vice President of R&D at Digital Element

      By Article

      This conversation comes ahead of Cybersecurity Awareness Month, sharing what information is available to our network of tech leaders and the cybersecurity solutions available to them.

      Jonathan Tomek is a VP at Digital Element, a global leader in IP geolocation and intelligence for over 20 years. He is a seasoned threat intelligence researcher with a background in network forensics, incident handling, malware analysis, and many other technology skills. Previously, Jonathan served as CEO of MadX LLC, Head of Threat Intelligence at White Ops, and Director of Threat Research at LookingGlass Cyber Solutions, Inc.

      In this Q&A, Jonathan shares the challenges that many of the world’s largest websites, brands, security companies, ad networks, social media platforms and mobile publishers face, and the best practices his team uses to combat online fraud.

      M.R. Rangaswami: With the rise of VPNs and residential proxy IP networks, many corporate security teams struggle to see who is accessing their networks and data. How should they approach security as these trends accelerate?

      Jonathan Tomek: IP address intelligence data can help security teams hone their best practices for establishing rules for who can access their network. For instance, IP address data reveals a great deal about masked traffic, such as whether it is coming from a VPN, darknet or residential IP proxy. With this knowledge, security teams can opt to block all darknet traffic automatically.

      Likewise, knowing that many people use residential IP proxies to scrape websites for competitive research, security professionals can opt to block all residential IP proxies.

      The important factor here is context. A company may not be concerned about VPN traffic in general, but if thousands of failed login attempts from a specific VPN over a short time period are observed, this would be indicative of an individual threat versus many unknown attacks.

      Digital Element also knows a great deal about the VPN market, including which providers offer features that enable nefarious players to hide their activities.

      That insight can be used to set access policies based on the VPN provider. For instance, you may want, as a matter of policy, to block all traffic that stems from VPNs that are free, or accept crypto payment and allow no-logging behavior as an option, as they are features that allow bad actors to cover their tracks.

      Though blocking is a common theme, the context provided can be even more important at times, especially after an incident, when it helps teams understand the characteristics of the threat and narrow the area of focus.
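      The context-driven rules described above can be sketched as a small policy function. This is an illustrative sketch only: the traffic categories, field names, and failed-login threshold are hypothetical assumptions, not Digital Element's actual API or data model.

```python
from dataclasses import dataclass

@dataclass
class IPContext:
    # Hypothetical fields an IP intelligence feed might provide
    category: str            # e.g. "vpn", "darknet", "residential_proxy", "direct"
    failed_logins_1h: int    # recent failed login attempts from this source

def access_decision(ctx: IPContext, failed_login_threshold: int = 1000) -> str:
    """Return 'block', 'flag', or 'allow' based on masked-traffic context."""
    if ctx.category in ("darknet", "residential_proxy"):
        return "block"    # block automatically, per the policies described above
    if ctx.category == "vpn" and ctx.failed_logins_1h >= failed_login_threshold:
        return "flag"     # concentrated failures suggest a targeted threat
    return "allow"

print(access_decision(IPContext("darknet", 0)))   # block
print(access_decision(IPContext("vpn", 5000)))    # flag
print(access_decision(IPContext("vpn", 2)))       # allow
```

      The point of the sketch is that the same VPN traffic is treated differently depending on context: ordinary VPN sign-ins pass, while a burst of failed logins from one VPN source gets flagged for investigation.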

      M.R.: Requesting additional authentication is a safe, but costly, practice. How can IP address intelligence data help security teams drive efficiency in their access policies?

      Jonathan Tomek: Asking for additional authentication is a good security measure, but it does require additional computing power, which isn’t free. It also affects the user experience, especially when a loyal customer signs into a system frequently.

      IP address intelligence data is useful here, both in helping networks save resources, and ensuring a more seamless user experience. Such insights include IP stability, which tells us how long a specific IP address has been observed at a specific location.

      If a customer signs into your network every day via the same IP address observed at the same geolocation, there may be no need to request a second authentication. But if one day that user attempts to sign in from an IP address at a geolocation on the other side of the country, or from a more local region but via a VPN, it would be a good idea to validate them.

      IP address intelligence data can provide context to help security teams set policies that prioritize when to request additional authentication.
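      The step-up authentication logic described here can be sketched as a simple policy check. The inputs and the 30-day stability threshold are hypothetical illustrations, not a specific vendor's scoring model:

```python
def needs_second_factor(same_ip_days: int,
                        geo_matches_history: bool,
                        is_vpn: bool,
                        stable_days_required: int = 30) -> bool:
    """Decide whether to request additional authentication for a sign-in."""
    # A long-stable IP at the user's usual geolocation can skip the extra step.
    if same_ip_days >= stable_days_required and geo_matches_history and not is_vpn:
        return False
    # Unfamiliar location, or a familiar region reached via VPN: validate the user.
    return True

print(needs_second_factor(90, True, False))  # False: familiar, stable sign-in
print(needs_second_factor(0, False, False))  # True: new IP, new geolocation
print(needs_second_factor(90, True, True))   # True: local region but via VPN
```

      A policy like this saves the cost of a second factor for routine sign-ins while still challenging the anomalous cases the text describes.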

      M.R.: How can IP intelligence data help security teams understand how a breach occurred and minimize any damage done?

      Jonathan Tomek: That’s a great question. Every security professional understands that, try as you might, it is simply impossible to prevent a breach.

      The best approach is to be able to respond quickly and minimize the impact in the event of a breach. IP address intelligence is a critical addition to a security information and event management (SIEM) solution.

      By leveraging IP intelligence, you have additional data points which can help reduce false positive alerts, while also refining other alerts for investigators.

      The ability to cluster events is a huge timesaver. If a specific VPN was used during a breach, you could find related IP addresses and see how the attacker was attempting to gain entry to your infrastructure, helping you with the timeline.

      M.R. Rangaswami is the Co-Founder of

      Read More

      SEG Snapshot: 2Q23’s SaaS M&A and Public Market Report

      By Reports

      Software Equity Group’s quarterly report is in. It reveals that an improved outlook across the broader macroeconomy, industry excitement around AI, and overall investor optimism for growth businesses contributed to a solid first half for publicly traded B2B SaaS companies.

      Meanwhile, continued strategic buyer and private equity interest has resulted in strong M&A outcomes for high-quality SaaS businesses exhibiting capital efficient growth, strong retention, and product differentiation. 

      Here are five highlights from the report:

      1. Aggregate Software Industry M&A deal volume has seen strong momentum in recent quarters, reaching 897 total deals in 2Q23, up 5% from 855 deals in 1Q23.

      2. Deal activity for SaaS M&A remains high relative to historical periods (538 deals in 2Q23). Although deal volume decreased 5% from the prior quarter, SaaS M&A is on pace for the second-highest annual total in the last ten years (eclipsed only by the bubble year of 2022). May saw 192 M&A deals, the second-highest monthly deal volume for SaaS in ten months.

      3. The average EV/TTM revenue multiple for 2Q23 was 5.6x. However, specific cohorts within SaaS continue to sell for premium multiples; companies that fit the profile on SaaS KPIs (capital-efficient growth, strong retention, etc.) and product differentiation are achieving strong outcomes.

      4. Vertical SaaS comprised 46% of all M&A deals in 2Q23. Financial Services jumped up to the pole position of the verticals, representing 18.9% of all SaaS deals.

      5. Private equity appetite for SaaS M&A remains high as it represented the majority (61.3%) of deals in 2Q23. PE-backed strategics represented 52.4% of deals, and PE platform investments were 8.9%.

      Download the full report from Software Equity Group here:

      Read More

      M.R. Asks 3 Questions: Evan Huck, Co-Founder & CEO of UserEvidence

      By Article

      With trust among B2B buyers declining because of vendor over-promising, economic pressures, and shifting expectations, CEO Evan Huck and his co-founder Ray Rhoades have been evaluating the evolution of social proof in the buying journey.

      Pulling from his experiences working at TechValidate and SurveyMonkey, Evan was inspired to create a company that could help businesses quickly and efficiently capture customer feedback and — leveraging the power of AI — automatically create on-brand content at scale, removing a significant source of friction from modern go-to-market teams’ sales motions.

      M.R. Rangaswami: Trust is at an all-time low for B2B buyers. What’s causing this and why does it matter?

      Evan Huck: B2B buyers are becoming increasingly skeptical of vendor marketing hype after repeatedly being burned by sales teams over-promising and under-delivering. Economic pressures have placed increased scrutiny on every tech purchase, upping the ante on the importance of making the right purchase the first time. Additionally, a recent Gallup poll found that greater access to information, lack of company focus on the customer lifecycle, and shifting expectations from a younger generation of buyers are all contributing factors to the breakdown of trust between vendors and buyers. As a result, peer recommendations and social proof are emerging as critical factors in the B2B buying journey.

      Why does this matter? Vendors are no longer in control of the buyer journey, and they get less direct interaction with the prospect. Buyers expect to see relevant customer examples validated by real-world data before making large technology purchases. To rebuild trust with buyers, vendors need more than a handful of curated customer success stories – they need a library of authentic and relevant customer proof points that prove the product’s value across different use cases, company sizes, and industries.

      M.R.: More than ever before, B2B buyers now look to their peers, not vendors, when making buying decisions. How is UserEvidence helping B2B software companies use customer feedback to address this new reality? 

      Evan: Historically it has been very difficult to gather enough reliable customer stories – seeking out these proof points is often labor intensive, laden with approvals, and costly. In the past, companies typically have created their own content in-house or leaned on an outside agency for support in collecting and creating these assets. These solutions have left companies scrambling to fill in the gaps as buyers demand more real-world examples they can connect to.

      UserEvidence resolves these issues by providing one platform that all go-to-market functions can use to capture customer feedback and — through advanced generative AI capabilities — deliver unbiased customer stories and beautifully designed assets for companies to use in their sales initiatives. Long gone are the days of analyzing customer data manually; UserEvidence processes these datasets quickly so that go-to-market teams can start creating content that attracts buyers. Companies can now easily collect and create these customer stories at scale, taking control of their most valuable asset: real-world social proof.

      Another benefit of the UserEvidence platform is the ability to continuously capture feedback and sentiment from users and customers, at important junctures in the customer journey. Surveys are delivered at key moments throughout the customer lifecycle, creating a continuous stream of learnings and insights that drives good decision making.

      M.R.: Getting feedback from actual customers helps not only B2B buyers, but every internal function across GTM teams. How does UserEvidence plan to bridge this gap?

      Evan: Every function in a B2B company — from the functions that sell a product (product marketing, sales enablement, customer marketing, and customer success), to the functions that build the product (product, product management, strategy) — should be guided by the voice of the customer and customer feedback.

      The problem is each function’s efforts to capture feedback are siloed, and the learnings from each effort aren’t shared between functions. Positive stories from a product management survey never make it into the hands of a sales team. Negative feedback from a marketing team’s efforts to find users willing to do case studies never makes it to product management or customer success.

      UserEvidence helps unify feedback collection efforts across functions, and helps each function take action on that feedback. Marketing can create on-brand sales and marketing assets, while product management can get insights on how to make the product experience better. Several goals are accomplished with one touch to the customer, making for a more elegant customer experience.

      M.R. Rangaswami is the Co-Founder of

      Read More

      Quick answers to Quick Questions: Ivan Houlihan, SVP & Head of West Coast U.S for IDA Ireland

      By Article

      A slightly different conversation this week as we speak to Ivan Houlihan, Senior Vice President and Head of the West Coast of the United States for IDA Ireland, the Irish Government agency that promotes foreign direct investment into Ireland.

      Based in California, Ivan leads the team that works closely with existing and potential clients in technology, financial services, life sciences and engineering throughout the Western US and Mexico. 

      We hope you enjoy this week’s angle on cybersecurity, cyber skills and microcredentials.

      M.R. Rangaswami: How Do Microcredentials Address the Cybersecurity Talent Scarcity Problem?

      Ivan Houlihan: While nations pass resolutions and laws that try to prevent cybercrime, the most widespread answer is increasing the supply of expert security talent to stay ahead of the criminals. 

      Houlihan suggests an innovative approach: microcredentials. These are small, accredited courses that allow candidates to pursue highly focused upskilling and reskilling in response to specific market needs. Besides quickly bringing qualified new candidates on board, this approach opens the door to workers who might otherwise not have pursued careers in cybersecurity.

      As the head of the West Coast U.S. for IDA Ireland, Houlihan has seen an increasing number of American technology firms with operations in Ireland employ this strategy to address their cybersecurity talent crunch. 

      When it comes to microcredentials in cybersecurity, Houlihan believes that Ireland’s innovative training programs can become a model for other nations seeking to address the serious issue of cybercrime, which is predicted to cost the world $10.5 trillion by 2025. In this quick Q&A, he explains the basics of setting up a microcredentials program in the cybersecurity space – although microcredentials can be earned in other technical areas, too.

      M.R. Rangaswami: What are some of the current issues impacting cybersecurity staffing and why are microcredential programs a reasonable solution? 

Ivan: Technology workers in general are often in short supply, but when it comes to qualified cybersecurity personnel, the problem is compounded by educational requirements and specific skills that take time and money for those seeking to enter this field. Technical degrees, specialized training and, often, some graduate work have discouraged many would-be candidates, particularly those put off by the prospect of student loans and related barriers. One of the biggest myths about the cybersecurity field is that it’s only for people with high proficiency in math, for men, or for those with certain graduate degrees. People also assume they must study at renowned universities in order to pursue such careers. All these factors have conspired to shrink the pool of qualified candidates.

Microcredential programs short-circuit the time and cost of pursuing a lucrative cybersecurity career, although the field does require some technical training as a starting point. Fortunately, assumptions such as needing to be male or to hold a graduate degree don’t apply. Microcredentials bring down the cost and time commitments while increasing cybersecurity job opportunities for women, military veterans, minority groups, people from financially disadvantaged backgrounds, workers from other departments and others previously not often found in the profession. And since microcredential programs are typically online, short in duration and able to be “stacked” or combined into bigger accreditations, it is easier to get the right kind of training for a promising new career. The most successful microcredential programs are a collaborative effort between universities, governments, research institutions and industry, with the latter providing curriculum input based on what candidates need to know to hit the ground running.

      M.R.: Describe the cybersecurity microcredential programs you’re aware of, how they operate and the results so far.


It’s encouraging that Ireland has been ahead of other nations in its efforts to increase the supply of cybersecurity talent. Last year, the International Information System Security Certification Consortium, or (ISC)², the world’s largest IT security organization, released a report that found Ireland closed its cybersecurity skills gap by 19.5% while the global gap grew by 26.2%. Through a government grant in 2020, Ireland created Europe’s first microcredential program, called CyberSkills: a collaboration between national agencies, industry and three leading Irish universities, led by Donna O’Shea, Chair of Cybersecurity at MTU.

Sign-up and instruction are online. In addition to 30 carefully designed microcredentials that learners can take as standalone pieces of learning or integrate into predesigned academic pathways, the program utilizes what’s called the “cyber range”: a unique, cloud-based, secure sandboxed area that simulates real-world scenarios and environments where students can test their new skills.

When we talked to O’Shea, she told us that CyberSkills has already trained hundreds of people – and the program is expanding. She believes the simple but effective collaboration concept behind this program could be duplicated by other nations wishing to accelerate and expand their supply of cyber talent. The key underlying concept of CyberSkills is that the training is totally focused on graduates being able to walk into jobs immediately with the knowledge they need to be effective.

      At a higher level, everyone should look at these microcredential programs as a major innovation in workforce development and lifelong learning. Being largely co-designed by industry makes them relevant and effective while their ease of use and low cost create new avenues for skills development long into the future. 

      M.R. Rangaswami is the Co-Founder of


      M.R. Asks 3 Questions: Sahir Ali, Founder of Modi Ventures


Dr. Sahir Ali is a technology and healthcare leader, investor and board advisor with extensive experience in artificial intelligence, medical imaging, cancer research, enterprise technology and cloud computing. He has advised and led Fortune 500 companies, hedge funds and other organizations in implementing and integrating cloud technologies and artificial intelligence/data science.

Sahir is the founder of Modi Ventures, a private investment firm focused on investing in venture capital funds and early-stage startups in disruptive and emerging AI and medical technology applications, so we thought his insights into the current investing trends of healthcare AI and TechBio would be valuable.

      M.R. Rangaswami: What types of tech bio and healthcare AI investments are gaining funding in the current economic climate?

Sahir Ali: Some of the most exciting breakthroughs in medicine today are happening at the nexus of biology and computer science, using tools such as artificial intelligence (AI). There are two major tech-enabled bio investment themes: therapeutic platforms (drug discovery companies based on novel platform technology) and transformative technologies (companies developing applications of breakthrough technological advances such as genomics and digital health).

      M.R.: What advice do you have for emerging startups to succeed in the crowded healthcare technology market? 

Sahir: Startups that focus on platform technologies that can yield multiple programs and shots on goal, instead of individual assets with binary outcomes, tend to be very attractive from an investment perspective, as well as in their time-to-market and valuation proposition. We also encourage our founders to establish high-quality partnerships across the ecosystem — true platforms produce many more assets than any individual company can develop.

The healthcare industry is slow to adopt new technology, so startups need to market their product effectively to reach the target audience, especially for digital health and consumer products.

      M.R.: What areas of investing in healthcare AI are gaining the most traction in this economy? 

      Sahir: There is a great deal of traction (funding and support) for companies that combine AI technology to generate novel candidates and strong drug development expertise to validate and find the best potential drugs. Another key area is gene therapy, which offers the potential to cure—not just treat the symptoms of—many major diseases. Some of the most transformative technologies are major new applications of genomics. Next-generation sequencing has outpaced even the fabled Moore’s Law, as the cost and information content of sequencing has improved even faster than the cost and information content of computer chips.

Companies that incorporate next-gen sequencing into diagnostic applications can enable better clinical outcomes at radically reduced costs. When cancer is detected late, only 20% of patients survive for five years, but when detected early, 80% survive for five years. Early detection saves lives and billions of dollars per year in medical costs.

      M.R. Rangaswami is the Co-Founder of


      Act Small: The Key to Growing Durable Companies & Communities with M.R. Rangaswami


      For those of you who have followed M.R. and his illustrious career, you may know a little about his resume from four decades in Silicon Valley.

However, in M.R.’s interview with DataStax Chairman and CEO Chet Kapoor, they both offer stories, humour, reflections and lessons that take us beyond their LinkedIn profiles and into the minds of some of our industry’s great builders.

      We hope you enjoy this light-hearted conversation on your next commute.

M.R. Rangaswami is the Co-Founder of (the domain he bought for $20 in 1997)


      M.R. Asks 3 Questions: Rahul Ponnala, Co-Founder & CEO of Granica


      Rahul Ponnala is the co-founder and CEO of Granica — the world’s first AI efficiency platform — which is on a mission to make AI affordable, accessible and safe to use.

He previously served as Director of Storage and Integrations at Pure Storage, where he engineered and integrated large-scale databases and file storage systems powered by all-flash technology. As a governing board member of The FinOps Foundation under The Linux Foundation, he helps shape the future of cloud financial management. A multidisciplinary academic, Rahul has research spanning mathematics, information theory, machine learning and distributed systems, and he holds a portfolio of patents in computational statistics and data compression.

      M.R. Rangaswami: What are the hard business and/or technology problems that inspired you to found Granica?

      Rahul Ponnala: Advancements in deep learning have been powered by ever-larger models processing ever-growing amounts of data. The performance output of an AI algorithm is primarily determined by the diversity and volume of data it can access. So, as AI becomes integral to products and services in nearly every domain, access to “high quality” data will become both a critical necessity and a fundamental constraint, ultimately dictating the pace and effectiveness of AI investments at enterprises.

To derive “high quality” data, enterprises must extract the maximum amount of information from their data stores and thereby maximize the value of their data – but the challenge here is two-fold. First, as data volume grows, so do the costs of managing, processing and storing it in the cloud.

      Second, as the potential for insight from new data sources increases, the risk of misuse and mishandling increases. Enterprises who can successfully contain rising cloud costs associated with growing data stores, while ensuring the safe use of data in AI to preserve its analytical value, will develop formidable, competitive moats.

      Since its inception, Granica has been developing cutting-edge and efficient solutions to allow enterprises to maximize the value of their data – our AI efficiency services are no exception. We are witnessing a Cambrian-like explosion in the pace of deployment of AI into various apps, products and services, marking a major technological shift in the future of computing. And while there has been meaningful progress on the computing infrastructure and algorithmic layers of AI and ML, there has been little progress in increasing the signal-to-noise ratio of the data fueling these algorithms.

      This is a very difficult problem, involving deep information and computer science developments, combined with large-scale systems engineering – and this is precisely the problem Granica is focused on solving.

      M.R.: How will your AI efficiency platform impact the future of enterprise AI/ML adoption? What is your advice to organizations that want to adopt a more efficient and productive cloud data architecture for their AI initiatives?

Rahul: Extracting the maximum amount of information from data stores is perhaps the most critical element in the long-term success (or lack thereof) of an organization’s AI investments and strategy. So by delivering a platform capable of helping organizations do just that, Granica is democratizing access to AI by directly making AI more affordable, more accessible and safe to use.

By now, most organizations have grasped the importance and criticality of integrating an AI strategy into their corporate planning; in fact, this was the most popular question Wall Street analysts asked the management teams of big tech companies this past earnings cycle.

      Yet, most organizations – large and small – are left hamstrung in determining where to start and how to do so in an efficient manner, while operating under a set of both economic and time constraints imposed by the market.

When speaking with customers about AI, the number one question that comes up is: “How can I get started and where should I get started?” And our answer, unsurprisingly, is: “Let’s first evaluate the effectiveness and efficiency of your organization’s data strategy.”

      By getting plugged into a customer’s environment and providing deep, informative analytics with respect to their cloud data stores and how their data is being used, we are able to provide direct visibility and insight into the inefficiencies present in that customer’s data architecture and gain a deep understanding of that customer’s data and workload characteristics.

      This then allows Granica to quickly configure and tailor our platform to their environment and thus accelerate the time to value for the customer. By providing customers with efficient building blocks and tools for their data architecture and AI-powered applications, we can help them optimize their data access, storage and compute resources and thus maximize the value of their data.

      M.R.: You’ve expressed that people are integral to your company. What are your values/philosophies as a leader with respect to growing successful teams?

Rahul: At Granica, our employees, or “ninja warriors” as we like to call them, are the backbone of our organization. We share successes as a team, we make mistakes as a team and we challenge each other.

This not only allows us to bring our best professional selves to the office but also to build long-term friendships and trust with one another. We want each of our employees to feel comfortable turning to one another to seek guidance, help and coaching – not just about “work”, but also about personal circumstances.

      By doing so, we leverage the collective intelligence of the whole to put everything we can into delivering exceptional experiences for our customers and inspiring one another along the way.

      Everyone at Granica lives by the motto of “Whatever it Takes” and we actually have this signage up on our wall in the lobby of our headquarters. It doesn’t matter whether you’re an individual contributor or manager at Granica – we want everyone to be leaders and we want to provide the resources, mentorship and growth opportunities to allow each ninja to grow their careers to new heights.

      M.R. Rangaswami is the Co-Founder of


Allied Advisers Sector Update on Automation Software


      Allied Advisers has published their sector update on Automation Software which provides an overview of this important segment, recent exciting trends, the transactional market and active acquirers and investors in the ecosystem.

Automation technologies are becoming increasingly pervasive across industries, driven by the clear opportunity to achieve major improvements in productivity and process efficiency and to reduce human error. Tailwinds to this sector have been strengthened by rising costs of labor and operations in an inflationary environment, and by innovations in enabling technologies such as AI/machine learning, IIoT and cloud.

      The increasing adoption of automation will necessitate further investment into developing technology skills. It is expected that many manually repetitive and low-skill jobs will be replaced by automation technologies, leading to higher unemployment in the economy. On the positive side, automation also opens up the opportunity for workers to be freed up from mundane tasks. Workers who retool and elevate their skill sets for the new world will be able to use their time more effectively and work well with machines to their benefit.

      Download their full report here:

      Gaurav Bhasin is the Managing Director of Allied Advisers


      Quick Answers to Quick Questions: Aidan McArdle, VP of Technology, Cirrus Data


      Aidan McArdle serves as the VP of Technology for Cirrus Data, a leader in block data mobility technology and services. Prior to joining Cirrus Data, Aidan worked at Hewlett Packard Enterprises (HPE) for 17 years, focusing on enterprise storage, servers and operating systems.

      In his role at Cirrus Data, Aidan leads a global team to solve complex problems with great technology, develops global services programs, and leads all aspects of pre-sales, product development, and partner management for major initiatives. Aidan also serves as EMEA Partner Enablement Director, helping partners and customers deliver success with their software.

      M.R.: What is the most important cloud trend today and what makes it so important?

Aidan McArdle: Top of mind for organizations continues to be cloud adoption, but there is also a strong focus on FinOps or, to put it simply, cost optimization, governance and control. The IT landscape has been awash with layoffs for more than a year now, and every enterprise is tightening purse strings as operating expenses (OPEX) come under increased scrutiny from those paying the public cloud bills.

When storage was largely on-premises, production environments were almost always overprovisioned. It was all capital expenditure planned well in advance, and it wasn’t uncommon to see 30-40% utilization. In the cloud, the costs are monthly, and any wasted capacity hits OPEX budgets directly. Cost control and optimization have become the norm for enterprises, which are striving to find more cost-effective ways to deliver their desired level of performance, reliability, and security.
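To make Aidan’s capex-versus-opex point concrete, here is a small illustrative sketch. Only the 30-40% utilization figure comes from the interview; the capacity, price and term numbers below are invented for the example.

```python
# Illustrative only: the cost of idle, overprovisioned capacity.
# The 30-40% utilization range is from the interview; prices are made up.

def wasted_spend(provisioned_tb, utilization, price_per_tb_month, months=12):
    """Annual spend on capacity that sits idle."""
    idle_fraction = 1.0 - utilization
    return provisioned_tb * idle_fraction * price_per_tb_month * months

# On-prem: 100 TB provisioned, 35% utilized (midpoint of the quoted range).
# The waste is real but buried in a capex plan made years earlier.
onprem_waste = wasted_spend(100, 0.35, price_per_tb_month=20)

# Cloud: the same idle capacity shows up every month on the OPEX bill,
# which is why it draws immediate scrutiny.
cloud_waste = wasted_spend(100, 0.35, price_per_tb_month=23)

print(f"on-prem idle spend/yr: ${onprem_waste:,.0f}")
print(f"cloud idle spend/yr:   ${cloud_waste:,.0f}")
```

The point of the sketch is not the specific dollar amounts but where the waste surfaces: annually-invisible capex versus a monthly OPEX line item.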

      M.R.: How is cloud computing today impacting CIOs and their enterprises?

Aidan: How to best benefit from the cloud will be (or at least should be) at the top of each CIO’s goals for 2023. It’s very hard to find an enterprise that has not seen fallout from the post-COVID slowdown.

The race to the cloud and the need to accelerate digital transformation have delivered many lessons in the last three years. In the rush to flexibly scale and deliver agile applications, many created straightforward ‘Lift and Shift’ plans, the idea being that the organization could take the database or application running on-premises and move it to the cloud itself with little effort. What we’ve seen is that organizations that managed to get pieces of their workloads into the cloud themselves are struggling with huge cost overruns. Other organizations are stuck in delays trying to determine the best path forward.

      With a renewed focus on optimization, control, and governance, we will see a positive impact. Costs should be controlled and likely reduced while teams gain a focus on the value of FinOps. 

      I‘ve had a number of really interesting conversations with businesses about the cost of cloud, repatriation and the shift back to on-premise. We have helped some organizations repatriate their workloads as they realize that for their environment using on-premises or a hybrid cloud strategy is ideal. And for others we have found they can meet their goals without a lot of post migration pain by analyzing their workloads and optimizing ahead of moving them to the cloud.

This focus and thought process have sparked several interesting debates at management meetings this year and hopefully resulted in a plan to gain control over cloud spend at many enterprises.

      M.R.: What else should organizations be thinking about when considering cloud best practices?

Aidan: I don’t believe any organization is too small to look at FinOps and cost optimization. The fundamentals can help set down best practices for organizations of all sizes. For companies that are evaluating a cloud strategy in 2023 or 2024, I always recommend including the migration as part of the strategic planning. Migration is often an afterthought, and this leads to challenges. When accurate planning is not in place to connect people, process, time and budget to deliver on the intended outcomes, you will always find problems. By contrast, when the migration is planned properly, it is generally executed faster and with minimal impact to the business.

      M.R. Rangaswami is the Co-Founder of


      M.R. Asks 3 Questions: Dror Weiss, Founder & CEO of Tabnine


Dror Weiss, founder and CEO of Tabnine, and his team are the creators of the industry’s first AI-powered assistant for developers. As a generative AI technology veteran, he is on a mission to help developers and teams create better software faster.

In this quick conversation, Dror discusses how developers can leverage generative AI today, covers how open source is advancing the generative AI movement and shares his thoughts on what’s to come.

      M.R. Rangaswami: How can developers take advantage of generative AI technology today? What can they expect in terms of benefit?

Dror Weiss: Software developers can leverage generative AI for code today; in fact, Tabnine has around 8M installs from VS Code and the JetBrains Marketplace. Developers will see the most immediate benefit if they are working in languages that have a large open source example set (JavaScript, Python, etc.). However, the value of generative AI for code is likely even higher for the esoteric languages and unique code that are currently in the domain of enterprises.

Code completion numbers vary significantly (25-45%), but with detailed ROI studies our customers are seeing mid-teens to low-twenties percentage gains in actual productivity.

      M.R.: How is open source helping to advance the generative AI movement? 

Dror: At the moment, open source cannot compete on spending to build the largest models (e.g. GPT-4), because these currently cost hundreds of millions of dollars and pull in as much data as possible.

However, we are already seeing a strong evolution of open source toward smaller models built specially for particular use cases, such as code. We believe these specialized models are the way forward, and they have already significantly closed the gap with the largest models.

      Much like Linux became the default for operating systems, we expect that open source will do the same for AI.

      M.R.: What’s next for generative AI – for developers, the enterprise?

      Dror: For developers, we believe generative AI for code will continue to expand into areas such as testing, chat and custom models. As for the enterprise, they are pushing for secure and controlled solutions, indicating they are all in on generative AI. 

      M.R. Rangaswami is the Co-Founder of


      M.R. Asks 3 Questions: Pranay Ahlawat, Partner & Associate, Boston Consulting Group


We’re long past being able to escape Generative AI as a weekly conversation topic. From keynotes at software company conferences to investment themes for VC/PE investors, it is everywhere.

We reached out to Pranay Ahlawat, Partner and Associate Director at Boston Consulting Group, after reading his article on the Generative AI trends that really matter. We were impressed and intrigued by how Pranay sees this topic from multiple angles – advising clients, advising investors and working as a practitioner – and wanted to share his insights with our Sandhill executive network.

Pranay’s focus on enterprise software and AI at BCG helps him discern hype from reality, understand the trends that really matter and identify what software companies, enterprises and investors must know about Generative AI.

      M.R. Rangaswami: We have certainly been in hype cycles in the past, what is different about Generative AI and why does it matter?  

Pranay Ahlawat: Neither foundational models nor the problem of natural language conversation is new. Natural language processing, chatbot platforms and out-of-box text APIs from cloud vendors have been around for a decade now. Foundational models like ResNet-50 have been around since 2015. Two things are different about modern-day Generative AI.

First, modern language models, or Large Language Models (LLMs), are architecturally different and have a significant performance advantage over traditional approaches like Recurrent Neural Networks and LSTMs (Long Short-Term Memory). You will often hear the words “transformer” and “attention”, which, simply put, refer to the model’s ability to remember the context of the conversation more effectively. The quality of comprehension and the ability to generate longer free-form text are unlike anything we have seen in the past.
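The “attention” idea Pranay alludes to can be sketched in a few lines. This is the textbook scaled dot-product formulation, not any particular vendor’s implementation, and the matrices below are random toy data.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """softmax(Q K^T / sqrt(d_k)) V: each output row is a context-weighted
    mix of the value vectors, which is how the model keeps track of context."""
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # token-to-token relevance
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over positions
    return weights @ V

rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))   # 4 tokens, 8-dimensional queries
K = rng.normal(size=(4, 8))   # keys
V = rng.normal(size=(4, 8))   # values
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # one context-aware vector per token
```

When all scores are equal, the output for each token is simply the average of the value vectors; the learned query/key projections in a real transformer are what make the weighting selective.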

Second, these models have a killer app unlike any other, one that is immediately consumable by non-technical users. We have had transformative technology breakthroughs in the past – internet, mobile, virtualization and cloud – but nothing has come close to the astonishing rise of ChatGPT, which reached a hundred million users in about two months. This tangibility has added to the hype, and despite the huge potential, a lot of the claims about Generative AI are unrealistic.

It matters because of the potential impact on society. We are a small step closer to general intelligence, and we can potentially solve problems we weren’t able to solve before. It’s disruptive for many industries, like media, education and personalization. Time will tell how quickly this will happen.

      M.R.: What are the three things people must know about Generative AI today?

Pranay: For me, there are three underlying principles or things you must know: (1) Generative AI is getting democratized, (2) the economics of Generative AI are a crucial vector of innovation and (3) the technology itself has limitations and risks.

First, the technology at the platform level is already democratized, and the barriers to entry are continuing to go down. If you look at the commercial players, there are model vendors like Cohere and Anthropic, platform vendors like Google and AWS, and multiple other tooling and platform vendors, e.g. IBM watsonx and Nvidia NeMo, all making it easier to build, test and deploy generative AI applications. There is real excitement in open source and community-driven innovation at all layers, e.g. frameworks like PyTorch, foundation models like Stable Diffusion and LLaMA, model aggregators like Hugging Face and libraries like LangChain. Today, a developer can create a generative AI application in a matter of hours, with a lot of the complexity abstracted away by modern tooling. We have more than five hundred generative AI startups already, and the barriers to entry are continuing to come down.

Second, winners will know how to get the economics right. These models are incredibly expensive to train, tune and run inference on. A 300B-parameter model costs anywhere from $2-5M in compute to train, and models like GPT-3 cost 1-5 cents per query. To give you an intuition: if Google ran a modern large LLM like GPT-4 for all search queries, it would see profits go down by roughly $10B. So understanding the task and architecting for the right price/performance is imperative. There is a ton of innovation and focus on cost engineering today – from semiconductors to newer model architectures and training and inferencing techniques – all aimed at getting this price/performance balance right.
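Pranay’s serving-cost intuition can be made concrete with back-of-the-envelope arithmetic. The per-query cost range is from his numbers; the daily query volume below is our own illustrative, search-scale assumption, not a figure from the interview.

```python
# Back-of-the-envelope LLM serving economics (illustrative assumptions).
# Per-query cost range is from the interview; query volume is assumed.

COST_PER_QUERY = (0.01, 0.05)   # GPT-3-class inference, dollars per query
QUERIES_PER_DAY = 8.5e9         # assumed search-scale daily query volume

low, high = (c * QUERIES_PER_DAY * 365 for c in COST_PER_QUERY)
print(f"annual inference bill: ${low/1e9:.0f}B - ${high/1e9:.0f}B")
# Even the low end lands in the tens of billions per year, which is why
# price/performance engineering (smaller models, cheaper inference) matters.
```

The exact numbers are less important than the shape of the result: at web scale, a few cents per query compounds into a bill large enough to reshape the economics of the business running it.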

Third, there are well-documented risks that are still not fully understood. The problems of bias and hallucination are well documented, and there are also unknown cybersecurity risks, plus copyright and IP issues that enterprises need to worry about. Lastly, these models are only as good as the data used to train them, and they make mistakes – Google Bard’s infamous factual error on debut is a good reminder that AI is neither artificial nor intelligent.

      M.R.: Where are we in the adoption curve of Generative AI and where do you believe this is all going?

Pranay: We are still in the early innings here. We are seeing a ton of enterprises experiment and run pilots and POCs, but almost no adoption at scale. Certain use cases, like marketing, customer support and product development, are more ready and have out-of-box tooling, e.g. Jasper and GitHub Copilot. The reported performance gains vary significantly, however. There are many numbers, even from reputable sources, that are conjecture without any tangible evidence. Companies should evaluate these tools and assess impact before building business cases.

I believe adoption in the enterprise will be slower than most estimates. There are many underlying reasons for that: lack of a strategy and clear business case, lack of talent, lack of curated data, unknown technology risks, etc. The biggest challenge is change management – according to BCG’s well-known 70:20:10 framework, 70% of the investment in adopting AI at scale is tied to changing business processes, versus 20% in broader technology and only 10% in algorithms. These physics will remain the same.

We must also acknowledge that generative AI itself isn’t a silver bullet and that we are at the very top of the hype cycle. Get your popcorn, the movie has just begun!

      M.R. Rangaswami is the Co-Founder of


      M.R. Asks 3 Questions: Molham Aref, CEO of RelationalAI


      AI is the conversation we can’t get away from, so we’re doing our best to bring you as many perspectives, experts and insights into how enterprises are adapting, incorporating and utilising its rapid advancements.

Molham Aref is CEO of RelationalAI, an organisation building intelligence into the core of the modern data stack. Over a more than 15-year career in AI, he has been investigating and implementing how knowledge graphs benefit the building of intelligent data applications.

      M.R.: Generally speaking, how do you see AI advancing enterprise?

      Molham Aref: AI is an expansive concept that encompasses a wide range of predictive and analytical technologies. Gartner coined the term Composite AI to reflect the fact that AI in the enterprise is combining these technologies to help build intelligence into organizations’ decision making and applications. AI provides great opportunities to drive smarter and more insightful outcomes.

      Using AI, organizations can improve their decision making and achieve more reliable outcomes. The emergence of large language models (LLMs) has driven AI to an inflection point that requires a combination of techniques to generate results that cannot be achieved by point solutions.

      By leveraging AI, organisations can make accurate forecasts, anticipate customer behavior, and optimize resource allocation. This allows them to proactively address challenges, identify opportunities, and ultimately become more profitable. 

M.R.: How are you incorporating knowledge graphs into your work with AI and the enterprise?

Molham: Knowledge graphs were pioneered early on by technology giants – by Google to improve search results and by LinkedIn to understand connections between people. The technology models business concepts, the relationships between them, and an organization’s operational rules.

Specifically, a knowledge graph organizes data in a human-readable way, augmenting it with knowledge about the enterprise so that organizations can take their data, reason over it, and create inferences with the goal of making better decisions. This can be done in a variety of ways, including with graph analytics, which focuses on connections in the data.

      Organizations can augment their predictive models with an understanding of the relationships that exist between their data, for example, inventory and profit. These enhanced models enable organisations to arrive at decisions that make them more effective, more competitive, and more successful. 
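The structure Molham describes can be sketched in a few lines of Python: entities, typed relationships between them, and a rule that reasons over the connections. Every name below (the triples and the `region_of` rule) is invented purely for illustration:

```python
# A tiny knowledge graph: (subject, relation, object) triples.
triples = {
    ("WidgetA", "stored_in", "Warehouse1"),
    ("Warehouse1", "located_in", "Ohio"),
    ("WidgetA", "supplied_by", "AcmeCorp"),
}

def objects(subject, relation):
    """All objects linked to `subject` by `relation`."""
    return {o for s, r, o in triples if s == subject and r == relation}

def region_of(product):
    """A simple inference: a product's region follows from where it is stored."""
    return {
        region
        for warehouse in objects(product, "stored_in")
        for region in objects(warehouse, "located_in")
    }
```

A real knowledge graph adds schema, scale, and a query language, but the core idea, drawing inferences by traversing relationships, is the same.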

      Knowledge graphs are proving to be one more tool in the toolbox that will significantly advance the enterprise.

      M.R.: What do you see the future benefits being for organisations who build intelligent data applications?

      Molham: Imagine a world where applications seamlessly adapt to your data, driven by intelligent capabilities. Where your applications can take action on your behalf, notify you to make important decisions, and dynamically make recommendations in response to sudden changes.

      Once organizations understand the potential impact of AI, they start to embrace technologies like knowledge graphs and data clouds. And with the modern AI stack complete, they can start building applications that let them automate workloads.

      With intelligent applications making the easy decisions, humans are freed up to work on the things that are more interesting and complex. Intelligent applications take the drudgery and tedium out of business operations, so that experts can focus more of their time and energy on decisions and tasks that will have a bigger impact, are harder to make, or require more human ingenuity than can be codified in software. 


      M.R. Asks 3 Questions: Dr. Alan Baratz, CEO of D-Wave


Dr. Alan Baratz’s career picked up momentum when he became the first president of JavaSoft at Sun Microsystems. He oversaw the growth and adoption of the Java platform from its infancy to a robust platform supporting mission-critical applications in nearly 80 percent of Fortune 1000 companies. It was that vast experience, among many, that brightly lit the path to Alan’s next role with D-Wave.

First, as D-Wave’s Executive Vice President of R&D, Alan was the driving force behind the development, delivery, and support of all of D-Wave’s products, technologies, and applications. Now, having spent the last three years as D-Wave’s CEO, Alan is building on that expertise to hit a new stride and take his organization to the next level.

M.R. Rangaswami: Can you provide an overview of D-Wave’s technology and the state of the quantum computing market today?

Dr. Alan Baratz: It’s an incredibly exciting time in the quantum computing market, as we’re starting to see companies and governments around the world increasing both interest and investment in the technology. In fact, a study from Hyperion Research found that more than 80% of responding companies plan to increase quantum commitments in the next 2-3 years, and one-third of those will spend more than $15 million annually on quantum computing efforts.

      The accelerated adoption of quantum computing comes at a time when businesses are facing difficult economic headwinds and are looking for solutions that help reduce costs, drive revenue and fuel operational effectiveness. Quantum’s power and potential to tackle computationally complex problems make it an important part of any modern enterprise’s tech stack.

      And the market potential is significant. According to Boston Consulting Group, quantum computing will create a total addressable market (TAM) of between $450–$850 billion in the next 15 to 30 years, reaching up to $5B in the next three to five years. Many problems, especially those relating to optimization, can be solved with today’s systems.

There are two primary approaches to quantum computing – quantum annealing and gate model. While you may have heard that quantum computing won’t be ready for years, that longer timeline refers only to gate model.

      The reality is that practical quantum solutions, those that use quantum annealing systems, are already in market now, helping organizations solve some of their biggest challenges.

D-Wave customers are using our Leap™ quantum cloud service to gain real-time access to our quantum computers and hybrid solvers to tackle some of their most complex optimization problems. We offer a full-stack quantum solution – hardware, software and professional services – to give customers support throughout their quantum journey. And given our QCaaS (quantum computing-as-a-service) approach, we make it very easy for the enterprise to incorporate the technology into their compute infrastructure.

      M.R.: What are some examples of commercial applications you’re seeing?

      Alan: Optimization is an enterprise-wide challenge that businesses of all kinds face – whether they’re in financial services, manufacturing, logistics, life sciences, retail or more. Many common yet computationally challenging problems like employee scheduling, offer allocation, e-commerce delivery, cargo logistics, and supply chain distribution can all be represented as optimization problems, and thus solved by today’s quantum annealing technology. These problems are made more difficult by the vast amount of data generated daily, which can quickly translate into critical pain points that impact a business’ bottom line.
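The class of problems annealers target is usually expressed as a QUBO (quadratic unconstrained binary optimization): minimize x^T Q x over binary variables x. As a rough, hypothetical illustration of the formulation, not of D-Wave’s hardware or APIs, here is a toy QUBO encoding a “pick exactly one of two shifts” scheduling constraint, solved by classical brute force:

```python
from itertools import product

def solve_qubo(Q):
    """Brute-force the QUBO objective x^T Q x over all binary vectors x.

    An annealer searches this same energy landscape in hardware; the
    exhaustive loop here exists only to make the formulation concrete.
    """
    n = len(Q)
    best_x, best_energy = None, float("inf")
    for x in product((0, 1), repeat=n):
        energy = sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))
        if energy < best_energy:
            best_x, best_energy = x, energy
    return best_x, best_energy

# "Assign exactly one of two workers to a shift": penalize (x0 + x1 - 1)^2,
# which expands (dropping the constant) to -x0 - x1 + 2*x0*x1 in QUBO form.
Q = [[-1, 2],
     [0, -1]]
best_x, best_energy = solve_qubo(Q)
```

At realistic problem sizes, the 2^n enumeration above is exactly what becomes intractable, which is where annealing hardware aims to help.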

We’re seeing organizations increasingly turning to quantum-hybrid applications to address these optimization challenges. For example, the nation’s largest facility for handling shipborne cargo used D-Wave technology to optimize port operations, resulting in a 60% increase in crane deliveries and a 12% reduction in turnaround time for trucks.

      A major credit card provider is using quantum-hybrid applications to optimize offer allocations for its customer loyalty and rewards program to increase cardholder satisfaction while maximizing campaign ROIs. And a defense company created a quantum-hybrid application for missile defense that was able to consider 67 million different scenarios to find a solution in approximately 13 seconds.

      The commercial value is apparent, and if you’re not currently exploring quantum in your enterprise, I believe you’re already behind.

      M.R.: What’s next for quantum computing?

      Alan: The pace of innovation and progress in quantum computing is remarkable. From a commercial exploration and adoption perspective, I believe we’re going to see a major uptick in the near term, as more organizations recognize the technology’s potential and increase investments. Quantum has moved out of the lab and into the boardroom.

It’s no longer just relegated to the R&D teams to play with, but rather has captured the attention of business decision-makers faced with increasingly challenging and complex problems that require faster time-to-solution. With the increased adoption will come rapid development of proofs-of-concept and ultimately production applications that will help streamline daily enterprise operations.

      From a scientific view, I expect major developments on the horizon as quantum annealing technology further scales and reaches even higher qubit counts and coherence times. Gate-model development will continue to progress, as the industry hopes to eventually find a path toward low-noise systems that can actually solve problems. Lastly, we all will continue our efforts to demonstrate quantum’s advantage over classical compute for intractable problems.

We’re already seeing positive signs at D-Wave, as recent research findings contribute to a growing body of research that may lead us to the first practical quantum supremacy result.


      M.R. Asks 3 Questions: Riddhiman Das, Co-Founder & CEO, TripleBlind


On a recent tour of healthcare organizations across the nation, Riddhiman started closely evaluating how different organizations are securing their data and, even more importantly, how they are securely accessing and sharing it.

      From developing new drugs and medical devices to allocating scarce resources amidst supply chain issues, most advancement in healthcare hinges on having access to the right data. Moreover, some of the most sensitive and highly regulated data requires technology solutions that take all of that into account to solve this complex challenge.

Riddhiman recognizes the limits of the traditional solutions used to tackle data problems, and he has ideas on how the next wave of innovation can allow the healthcare industry to gain insights from health data while maintaining privacy.

      M.R. Rangaswami: Data is arguably the most critical driver of innovation in healthcare today. What trends is this driving and what are some key “amount of data” stats in healthcare?

      Riddhiman Das: I believe that data is the most critical driver of innovation in healthcare but there are limitations because the data is sensitive and as a result, regulated. Everything in healthcare hinges on having access to the right data: From developing new drugs and medical devices to allocating scarce resources amidst supply chain issues.

      It’s no secret that having continuous access to raw health data is invaluable— this fact is well established. However, recent advances in analytics, machine learning, and artificial intelligence have brought us to a tipping point where healthcare can no longer ignore the value of having access to data. 

And get this: privacy and compliance concerns have trapped two zettabytes of data in silos and removed $500B in value creation for healthcare organizations.

      M.R.: If we know healthcare has a data problem, how have we traditionally been trying to tackle it?

Riddhiman: Historically, organizations have tried to get around limited access to data by using synthetic, abstracted, or pre-anonymized datasets, but that strategy just doesn’t cut it. The method tends to be expensive and can result in flawed insights if the data contains errors or is missing a key element – that doesn’t really benefit anyone.

      We need access to data to drive the next wave of innovation—people’s health and well-being depend on it. We can only achieve this if the data is kept private to maintain patient privacy and the intellectual property rights of healthcare companies and their industry partners. 

      Over the years, initiatives have emerged to address this. Everyone has heard of HIPAA, which was enacted to protect patients’ health information from disclosure without their consent or knowledge. It also features standards designed to improve efficiency in the healthcare industry. The less-talked-about Sentinel Initiative was created to monitor the safety of medical products via direct access to patients’ electronic health records. Despite legislation and initiatives to help with this problem, the challenge remains and will only become more amplified as health data grows in volume and complexity. 

      Organizations have been shooting themselves in the foot by relying on manually de-identifying, abstracting, or normalizing data to get the insights they need. It’s nearly impossible to obtain meaningful, accurate, real-time insights from health data in this manner. This outdated method is hardware dependent, poses potential risks for re-identification, offers only partial security, and generally only works on structured or specific types of data. 

      M.R.: What are some fresh solutions to data and data privacy in healthcare you have seen?

      Riddhiman: We’ve seen quite a few technology solutions developed in recent years that tackle this issue in a way that allows healthcare organizations the ability to gain insights from data and maintain privacy beyond what regulations require. 

      Privacy-enhancing technologies (PETs) were specifically designed to make gleaning insights from health data scalable, accurate, and secure: a true win-win. One PET we’re truly excited about? Federated analytics.

      Federated analytics improves upon prior PETs and keeps health data safe in three ways. First, the data is secured at its point of residence so that external parties cannot access it in any meaningful way. Second, the data is kept secure as parties collaborate to decrease the risk of interception. Finally, the data is secured during computation, reducing the risk of sensitive information extraction. Organizations can also track how the data is used to ensure it is only leveraged for its intended purpose.

      Federated analytics software lowers the risks associated with sharing health data by eliminating decryption and movement of raw data, while allowing privacy-intact computations to occur. Additionally, technology improvements driven by federated analytics minimize the computational load necessary to analyze data, which reduces hardware dependency and increases scalability.
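The principle is easy to see in miniature. In this toy sketch, which illustrates the general federated idea rather than TripleBlind’s actual protocol, two hospitals compute a shared mean while only (sum, count) aggregates ever leave each site:

```python
# Each hospital computes a local aggregate over its own records;
# only (sum, count) pairs -- never patient-level rows -- leave the site.
site_a = [120, 135, 128]          # readings held at hospital A (illustrative)
site_b = [110, 142, 150, 133]     # readings held at hospital B (illustrative)

def local_summary(records):
    """Computed on-site: the only values shared with the aggregator."""
    return sum(records), len(records)

def federated_mean(summaries):
    """Aggregator combines per-site summaries into a global statistic."""
    total = sum(s for s, _ in summaries)
    count = sum(n for _, n in summaries)
    return total / count

mean = federated_mean([local_summary(site_a), local_summary(site_b)])
```

Production systems layer encryption and secure multi-party computation on top, but the shape is the same: the computation moves to the data rather than the data moving to the computation.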

      Other benefits include access to raw data beyond just structured data, including video, images, and voice data; more secure internal (across regulatory boundaries) collaboration and external (between organizations) collaboration; and a lower chance of non-compliance due to simplified, more cohesive contracting processes. 

      Federated analytics is driving healthcare towards the future. By safely scaling access to raw health data, organizations can optimize processes for clinical trials, develop and deploy groundbreaking AI algorithms, and bolster pharmacovigilance. Thanks to the development of federated analytics solutions, there is no longer a need to choose between gaining powerful insights that will shape the future of healthcare and keeping patient data private.


M.R. Asks 3 Questions: Slavik Markovich, Co-Founder & CEO, Descope


      Does password authentication really work anymore?

Descope Co-Founder and CEO Slavik Markovich has been watching the problems with traditional password authentication, such as user difficulties and security vulnerabilities, mount for years.

      As a solution, Descope is developing sound passwordless methods, such as magic links, one-time passwords, social login, authenticator apps, and biometric authentication, that are gaining traction due to the rise of open standards and support from major companies like Google, Apple, Microsoft, and Shopify.

In this conversation, Slavik gets straight into the user experience and the solutions we are seeing that work.

      M.R. Rangaswami: Why is passwordless authentication picking up steam? 

Slavik Markovich: Passwords also cause friction throughout the user journey, leading to churn and a negative user experience. No one wants the cognitive load of remembering unique 16-character passwords for every site or app they access, so they reuse passwords across sites, which is a recipe for disaster when passwords get leaked.

      Passwordless methods such as magic links, social login, and authenticator apps have been around for a while. Notable apps like Medium and Slack already use passwordless login, while authenticator apps are used as a common second factor in MFA.

However, the rise of open standards and mechanisms such as FIDO2, WebAuthn, and passkeys over the past few years has sent passwordless adoption into overdrive. There are a few reasons at play here:

      • Passkeys are based on biometrics, which users are familiar with since they already use fingerprint scanning and facial recognition to unlock their phone or other computing devices.
      • Passkeys are being adopted by Internet heavyweights such as Google, Apple, Microsoft, and Shopify, who are also taking steps to educate users about the benefits of these methods.

      M.R.: What are some examples of passwordless authentication techniques? 

      Slavik: Passwordless methods verify users through a combination of possession (what they have) and inherence (who they are) factors. These factors are typically harder to spoof and are more reliable indicators of a user’s identity than knowledge factors are.

      These examples include: 

      • Magic links, which are URLs with embedded tokens that – when clicked – enable users to log in without needing a password. These links are mostly delivered to the user’s email account, but can also be sent via SMS and other messaging services like WhatsApp.
      • One-time passwords / passcodes, which are dynamically generated sets of numbers or letters meant to grant users one-time access to an application. Unlike passwords, an OTP is not static and changes every time the user attempts login.
      • Social login, which authenticates users based on pre-established trust with an identity provider such as Google, Facebook, or GitHub. Using social login precludes users from creating another set of credentials – they can instead focus on strengthening the passwords they already have on their identity provider account.
      • Authenticator apps, which operate based on time-based one-time passwords (TOTP). A TOTP code is generated with an algorithm that uses a shared secret and the current time as inputs – this means the code changes at set time intervals, usually between 30 to 90 seconds.
      • Biometric authentication, which are physical or behavioral traits that are unique to an individual. Biometric authentication checks these traits to grant users access to applications. Popular biometric authentication techniques in use today include fingerprint scanning and facial recognition. Biometrics are used in passkeys authentication, which I covered in the previous answer.
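Of these, the TOTP scheme behind authenticator apps is compact enough to sketch directly. The function below follows RFC 6238 (an HMAC over a time-step counter, then dynamic truncation); real deployments add secret provisioning, rate limiting, and clock-drift tolerance on top:

```python
import hashlib
import hmac
import struct
import time

def totp(secret, at=None, step=30, digits=6):
    """RFC 6238 TOTP: HMAC-SHA1 over the current time-step counter."""
    counter = int((time.time() if at is None else at) // step)
    digest = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: ASCII secret "12345678901234567890" at time 59
print(totp(b"12345678901234567890", at=59, digits=8))  # → "94287082"
```

Because both sides derive the code from the shared secret and the clock, the value changes every `step` seconds without any network round trip.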

      M.R.: How do you see this technology evolving over the next several years? 

      Slavik: I see the evolution of passwordless technologies mostly focusing on education and compatibility in the years to come. The key pillars will be:

• User education: Companies and the industry at large need to continue educating end users about the benefits of passwordless methods and the pitfalls of passwords. There are still common myths about passwordless methods like biometrics (e.g. what if someone steals my biometrics?) that need to be addressed (e.g. your biometrics never leave your device).
      • Developer education: Standards and protocols such as OAuth, SAML, WebAuthn, and others that form the basis of authentication mechanisms are complex. It takes developers time to pore over these protocols and implement authentication in their apps. Developers need to be provided with tools and enablement that abstract away the complexity of these protocols and let them add passwordless methods to their apps without lots of added work.
      • Compatibility: Passkeys compatibility is a work in progress. Over the coming months and years, more apps, browsers, and operating systems need to support passkeys if a passwordless future is to become reality. 

All three points above are interrelated. If user education and developer enablement continue improving, more entities will be incentivized to add passwordless support, and vice versa.


      Vertical SaaS vs. Horizontal SaaS


      Horizontal SaaS vs. Vertical SaaS – Which flavor of SaaS do you prefer?

Allied Advisers has updated their previously published Flavors of SaaS report, where they include an analysis across a select group of companies comparing operational metrics between the two flavors of SaaS: Horizontal SaaS and Vertical SaaS.

      As advisors who have worked across both flavors, they’re sharing some interesting differences.

While Horizontal SaaS companies generally have larger TAM, Vertical SaaS companies can be more capital efficient and have better operational metrics, making them better suited for middle-market funds.

      While there are category leaders in Horizontal SaaS, there are also a lot of opportunities in building Vertical SaaS companies which can become leaders in their own sectors. In today’s environment where capital efficient growth is being keenly measured, Vertical SaaS companies offer compelling opportunities for investors and buyers.


      I. Many SaaS firms focus on Vertical SaaS models to target a specific niche, allowing them to better serve industry specific client demands and making them easier to market.

      II. Vertical SaaS has seen rapid growth of businesses with smaller but more focused TAM (as compared with Horizontal SaaS) and generally more capital efficient business models.

      III. The market downturn in 2022 and Covid impacted some Vertical SaaS markets but overall digital transformation continued to accelerate within industries, with standardized solutions not being sufficient to address vertical needs.

      IV. We see continued investor interest in Vertical SaaS due to high growth prospects supported by strong business fundamentals, along with generally better performance on multiple metrics than peer Horizontal SaaS companies.

For the full Allied Advisers report, see below:

Gaurav Bhasin is the Managing Director of Allied Advisers

      Read More

M.R. Asks 3 Questions: Peter Maier, SVP of SAP


With a new book on the market, Business as UNusual with SAP, we have been looking forward to talking with Vinnie Mirchandani and his two senior VPs at SAP about the megatrends they’re seeing powerfully ripple across the industry.

As co-author and SVP of Strategic Customer Engagements at SAP, Peter Maier was great to speak with, elaborating on how megatrends are changing competitive playing fields and shaping best business practices.

      M.R. Rangaswami: What was the motivation for you and your co-author, Thomas Saueressig, to write the book, Business as UNusual with SAP?

Peter Maier: In our customer conversations, Thomas and I experience every day how megatrends are driving the business and technology agenda of our customers. We found it worthwhile to share their voice and perspective on how leaders successfully navigate industry megatrends using the capabilities of our intelligent suite and our intelligent technologies.

      There are a few simple but deep principles that drive SAP’s product and innovation strategy for our customers in their industries: we focus on our customers’ core business, because that’s where they drive revenue, competitive differentiation and strategic transformation of business models and business operations.

      Then we look at end-to-end processes that run along value chains and across industry and company boundaries (that’s why digital business networks are so important). And we use a business filter when we look at new digital technologies: which have the potential to transform our customers’ business?

Artificial intelligence is a great example here: we believe there is huge business potential – but realizing this potential requires integrated end-to-end industry processes. So each megatrend can transform the business of our customers in their industries – and digital technologies are key enablers.

M.R.: In your opinion, what makes this period of time “unusual”?

      Peter: All consultants have been claiming for decades that the ongoing change requires customers to adjust their strategies and operations. However, the last three years have shown us how fundamentally and quickly our world can change and how important the ability to rapidly adapt to change has become. Multi-year corporate programs have been compressed into quarters, months, and weeks. Fundamental beliefs have gone out of the window. And we perceive a new open-mindedness of many leaders to try new things – to embrace the idea to run a “business as unusual”. So we think it makes sense to use this momentum and start customer engagements to discuss how megatrends can inspire new ways of doing new things. 

      Many people feel threatened by change. If you look into the root cause for this reaction, you’ll find that change is stressful if it outpaces your ability to adjust or even take advantage of it. This is a very good reason to build and run an organization so that it can easily (or at least better than their peers) cope with disruptive change. And this change comes from all directions, just look at the drivers like generative AI, sustainability, virtual reality, metaverse, geopolitical conflicts, or pandemics. “Prepping” for all eventualities is certainly not the answer, but building and running an intelligent, sustainable, resilient, and agile enterprise certainly is. And many companies and institutions look at SAP to find solutions for this transformation.

M.R.: What are the most opportunistic and problematic trends that the book covers?

      Peter: We believe that every single megatrend we are discussing holds threats and promises, depending on the reader’s attitude to running a “business as unusual.” Moving from selling products to providing and monetizing the outcome of using the product (“Everything as a service”) can be viewed as a problem for a business – or it can be treated as a great opportunity to create and expand new revenue streams, develop new business models, and establish fresh customer relationships.

      Moving to a “circular economy” drives change in product design, supply chain, procurement practices, and product-end-of-life management in many industries. Whether this change is a reason for optimism or pessimism depends on whether this change is viewed as an opportunity or threat. And you will find the same duality in every single megatrend.

      Over the course of our research and the discussions with customers, partners, and SAP experts the opportunity/threat balance clearly shifted from seeing problems and challenges to appreciating the potential for innovation and new business relationships. And of course, we are very happy and pleased that our SAP solutions will play key roles in tackling the challenges and capturing the promised value from transforming business processes and business models.

      There are many digital technology trends – most prominently artificial intelligence – which we don’t feature in Business as UNusual with SAP as megatrends.

      Business as UNusual with SAP focuses on business megatrends and how they shape and change competitive playing fields and best business practices, or how they transform end-to-end business processes along value chains and across industry boundaries.

      Technology has always influenced, accelerated, and sometimes triggered business megatrends, and you will find that digital and other technologies and their impact are discussed in the context of each megatrend, from Lifelong Health to New Customer Pathways and from Integrated Mobility to the Future of Capital and Risk.


      M.R. Asks 3 Questions: Dr. Yu Xu, Founder & CEO of TigerGraph


      With 26+ patents in parallel data management and optimization, TigerGraph’s founder and CEO, Dr. Yu Xu, has extraordinary expertise in big data and database systems.

      Having worked on Twitter’s data infrastructure for massive data analytics and led Teradata’s big data initiatives as a Hadoop architect, not only does Yu have an impressive resume, but his ability to explain detailed concepts in a simplified way made for easy conversation.

      M.R. Rangaswami: Graph databases are gaining momentum as more organizations adopt the technology to achieve deeper business insights. What exactly is a graph database?

      Yu Xu: The world is more hyper-connected than ever before, and the ability to tap into the power of rich, growing networks – whether that be financial transactions, social media networks, recommendation engines, or global supply chains – will make or break the bottom-line of an organization. Given the importance of connections in the modern business environment, it’s critical for database technology to keep up.

      Legacy databases (known as relational or RDBMS) were built for well-mapped, stable and predictable processes like finance and accounting. These databases use rigid rows, columns and tables that don’t require frequent modifications, but are costly and time-consuming when adjustments need to be made.

      The graph database model is built to store and retrieve connections from the ground up. It’s more flexible, scalable and agile than RDBMS, and is the optimal data model for applications that harness artificial intelligence and machine learning. 

      A graph database stores two kinds of data: entities (vertices) and the relationships between them (edges). This network of interconnected vertices and edges is called a graph. Graph database software stores all the records of these interconnected vertices, attributes, and edges so they can be harnessed by various software applications. AI and ML applications thrive on connected data, and that’s exactly what graph technology delivers.

      M.R.: What’s the difference between native and non-native graph databases?

      Yu: As graph technology grows in popularity, more database vendors offer “graph” capabilities alongside their existing data models. The trouble with these graph add-on offerings is that they’re not optimized to store and query the connections between data entities. If an application frequently needs to store and query data relationships, it needs a native graph database. 

      The key difference between native and non-native graph technology is what it’s created for. A native graph database uses something called index-free adjacency to physically point between connected vertices to ensure connected data queries are highly performant. Essentially, if a database model is specifically engineered to store and query connected data, it’s a native graph database. If the database was first engineered for a different data model and added “graph” capabilities later, then it’s a non-native graph database. Non-native graph data storage is often slower because all of the relationships in the graph have to be translated into a different data model for every graph query. 
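Index-free adjacency can be pictured as each vertex physically holding references to its neighbors, so a multi-hop query is pointer-chasing rather than repeated index lookups or joins. A schematic sketch, not TigerGraph’s storage engine:

```python
class Vertex:
    def __init__(self, name):
        self.name = name
        self.neighbors = []   # direct references to adjacent vertices -- no index

    def link(self, other):
        self.neighbors.append(other)

# alice -> bob -> carol : a two-hop "friends of friends" traversal
alice, bob, carol = Vertex("alice"), Vertex("bob"), Vertex("carol")
alice.link(bob)
bob.link(carol)

def two_hop(vertex):
    """Follow edges twice by chasing references, never consulting a join table."""
    return {w.name for n in vertex.neighbors for w in n.neighbors}
```

A non-native graph layer would answer the same query with self-joins over an edge table, which is where the per-query translation cost described above comes from.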

      M.R: What are some ways that businesses are leveraging graph databases?

      Yu: The use cases for graph technology are vast, diverse, and growing. If an application frequently queries and harnesses the relationships between users, products, locations, or any other entities, it will benefit from a native graph database. The same is true if a use case leverages network effects or requires multiple-hop queries across data.

      Some of the most popular use cases for graph include fraud detection, recommendation engines, supply chain management, cybersecurity, anti-money laundering, and customer 360, just to name a few. If your enterprise relies on graph analytics or graph data science, then it needs a native graph database to ensure real-time performance for mission-critical applications. 


      M.R. Asks 3 Questions: Ayal Yogev, Co-Founder & CEO of Anjuna


      Ayal Yogev is the co-founder and CEO of Anjuna, the leading multi-cloud confidential computing platform. Ayal firmly believes that the best security solutions are enablers – they open up new opportunities that wouldn’t exist without a heightened level of security and trust. To achieve this, the industry needs a new way of thinking, building, and delivering applications that keeps enterprises in the driver’s seat and keeps their data protected at all times. 

      Ayal is passionate about giving companies the freedom to run applications anywhere in the world with complete data security and privacy. That’s why he co-founded Anjuna.  

      With over two decades of experience in the enterprise security space, Ayal shares his thoughts on how confidential computing will impact the cybersecurity landscape. He explains how confidential computing will be the antidote to today’s patchwork of ineffective security solutions, and how it’s poised to make security an enabler of innovation rather than an inhibitor. 


      M.R. Rangaswami: Can you explain what confidential computing is and why it’s now seeing increased momentum? 

      Ayal Yogev: The majority of today’s cybersecurity solutions focus on detecting a breach once it’s already happened, then dealing with the repercussions. However, this approach leaves applications and data extremely vulnerable. Confidential computing addresses this vulnerability by processing data inside a hardware-isolated secure enclave, which ensures that data and code are protected during processing. Even in the event of a breach, applications running in confidential computing environments are invisible to attackers and therefore tamper-proof. 
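      One building block behind enclave security is attestation: a verifier releases secrets only to code whose cryptographic measurement matches a trusted value. A toy sketch of that idea, not Anjuna’s or any vendor’s actual protocol:

```python
import hashlib
import hmac

# Toy attestation flow: the verifier knows the measurement (hash) of approved
# enclave code and releases a secret only when the reported measurement matches.
TRUSTED_MEASUREMENT = hashlib.sha256(b"approved_enclave_code_v1").hexdigest()

def attest(code_blob):
    """Measure the code claiming to run inside the enclave."""
    return hashlib.sha256(code_blob).hexdigest()

def release_secret(measurement, secret):
    """Hand over the secret only to code with a trusted measurement."""
    if hmac.compare_digest(measurement, TRUSTED_MEASUREMENT):
        return secret
    return None  # tampered or unknown code gets nothing

print(release_secret(attest(b"approved_enclave_code_v1"), b"db-key") == b"db-key")  # True
```

      Real enclaves perform the measurement in hardware and sign it with a hardware-rooted key, but the gating logic is the same: tampered code produces a different measurement and never sees the data.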

      Confidential computing has seen rapidly growing support from cloud service providers and hardware manufacturers such as Intel, AMD, Nvidia, and Arm because of its massive, positive impacts on data security. However, it’s largely flown under the radar because of the engineering feat required to re-architect workloads to take advantage of it. Prior to Anjuna, it would take significant developer effort to re-code an application to work in just one of the clouds and then you’d have to repeat the work for each cloud you wanted to use. This is a daunting idea for many enterprises and a big reason why adoption has been slow. But this is changing. 

      Similar to what VMware did for server virtualization, Anjuna provides a new specialized software layer that allows enterprises to take advantage of the new hardware capabilities without the need to recode. Anjuna abstracts away the complexity of confidential computing CPUs and democratizes access to a powerful technology that will redefine security and the cloud.

      M.R.: Which industries and companies are adopting this technology and what are the impacts they’ve seen?

      Ayal: According to IDC, less than half of enterprise workloads have moved to the cloud. Regulated verticals like financial services are only 20% of the way into their cloud journeys, meaning that 80% of workloads remain on-premises. Running applications on-premises is less scalable, more complex, and typically more expensive than running them in the cloud, yet security keeps CIOs from moving, because in the cloud, data security and privacy become a shared responsibility between you and your cloud service provider. Confidential computing finally solves this fundamental issue by isolating code and data from anyone with access to your infrastructure.

      The value of confidential computing is broadly applicable and I expect that a few years from now confidential computing will be how all enterprise workloads run. In the short term, we see most security-conscious and heavily regulated organizations as the early adopters. Anjuna, for example, works with companies in financial services, government, blockchain, and other highly sensitive industries. 

      M.R.: When can we expect to see this technology impact our daily lives? What will this look like?

      Ayal: Confidential computing is already present in our everyday lives – we use it to protect our phones, credit cards, and more. It is now moving to the server side, and in the future it will extend to the edge, creating a world of borderless computing.

      Adoption of confidential computing is at an inflection point. The ecosystem of manufacturers and cloud service providers has already moved: Intel, AMD, Arm, Nvidia, AWS, GCP, Azure, Oracle, and IBM have already shipped, or are about to ship, confidential-computing-enabled hardware and cloud services. What has been missing is the software stack that democratizes access to these new, powerful capabilities, making them easy to use for all apps without modifications.

      I expect that over time, confidential computing will become the de-facto standard for how we run applications. The impact on our daily life will be huge. With ensured data security and privacy, organizations will not only be able to move more applications to the cloud, but also safely adopt emerging technologies like blockchain or AI. Moreover, entire new use cases like cross-organization data sharing and analytics will now be possible with incredible benefits in a wide range of industries like healthcare, financial services, media, and advertising.


      Quick Answers to Quick Questions: Jozef de Vries, Chief Product Engineering Officer, EnterpriseDB


      Meet Jozef de Vries, the mastermind behind the cutting-edge product development at EnterpriseDB (EDB), a pioneering company revolutionizing Postgres in the enterprise domain.

      With over 15 years of experience before joining EDB, Jozef has held various positions at IBM, including building the IBM Cloud Database development organization from the ground up.

      In this quick Q&A, Jozef shares how enterprises can leverage Postgres to cater to their database needs and how this open-source platform is shaking up the market.

      M.R. Rangaswami: In your opinion, how will Postgres disrupt the open-source database market?

      Jozef de Vries: Postgres already has disrupted the database market. The only question that remains is how quickly Postgres will take a majority share of the enterprise database market. EDB is exclusively focused on accelerating the adoption of Postgres as the database standard in the enterprise. 

      Taken together, the numbers show that Postgres is the most loved, most used, and most wanted database in the world. According to Stack Overflow’s developer surveys, its growth continued to accelerate in 2022 and beyond.

      Postgres is the fastest-growing database management system in what Gartner views as an approximately $80 billion market. EDB customers such as MasterCard, Nielsen, Siemens, Sony, Ericsson, and others have made Postgres their database standard.

      EDB builds Postgres alongside a vibrant community, disrupting the market with greater scalability and cost savings compared to any other system. With more contributors to Postgres than any other company, EDB delivers unparalleled expertise and power to enterprises looking to adopt Postgres as their database standard. 

      M.R.: How does Postgres (as an open-source object-relational database system) function?

      Jozef: Postgres addresses a wider range of modern applications than any other database today. This means that enterprises that run on Postgres can fundamentally transform their economics and build better applications with greater performance, scalability, and security.

      When Postgres was designed at the University of California, Berkeley more than 30 years ago, its designers made sure that the underlying data model was inherently extensible. At the time, databases could only use very simple data types, like numbers, strings, and dates. Michael Stonebraker, one of EDB’s distinguished advisors and strategists, and his team made a fundamental design decision to make it easy to add new data types and their associated operations to Postgres.

      For example, PostGIS is an extension of Postgres that makes it easy to work with geographic data elements, polygons, routes, etc. That alone has made Postgres one of the preferred solutions for mapping systems. Other well-known extensions are for document stores (JSON) and key-value pairs (HSTORE).
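      The extensible-type idea can be loosely illustrated with Python’s stdlib `sqlite3` module, which lets you register an adapter and a converter so a custom type round-trips through the database, a rough analogy to Postgres’s extensible type system (the `Point` class and `point` column type below are illustrative):

```python
import sqlite3

# A custom Point type stored in a column declared as "point", via a registered
# adapter (Python -> storage text) and converter (storage bytes -> Python).
class Point:
    def __init__(self, x, y):
        self.x, self.y = x, y

sqlite3.register_adapter(Point, lambda p: f"{p.x};{p.y}")
sqlite3.register_converter("point", lambda s: Point(*map(float, s.split(b";"))))

con = sqlite3.connect(":memory:", detect_types=sqlite3.PARSE_DECLTYPES)
con.execute("CREATE TABLE places (loc point)")
con.execute("INSERT INTO places VALUES (?)", (Point(1.5, 2.5),))
p = con.execute("SELECT loc FROM places").fetchone()[0]
print(p.x, p.y)  # 1.5 2.5
```

      Postgres goes much further, letting extensions define new on-disk types, operators, and index methods, but the principle of teaching the database a new data type is the same.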

      This extensible data model, together with the ability to run on every cloud, enables Postgres developers to be enormously productive and innovative.

      Alongside a robust independent open-source community, we have made Postgres an extraordinary database, superior to legacy proprietary databases and more universally applicable for developers than specialty databases.  

      Open source mandates, flexible deployment options, risk mitigation and strong security will drive much broader adoption of Postgres this year and next. EDB supports this with built-in Oracle migration capabilities, unmatched Postgres expertise and 24/7 global support. We uniquely empower enterprises to accelerate strategies, move applications to the cloud and build new applications on Postgres. 

      M.R.: What are the factors accelerating or inhibiting the adoption rate of Postgres?

      Jozef: Purpose-built for general use, Postgres powers enterprises across a wider variety and broader spectrum of applications than any other database, making it the economic game changer for data. There will always be specialty applications that require specialty databases. But for an enterprise standard, developers and IT executives rely on Postgres for the widest range of support. 

      Postgres technology is extraordinary and is improving faster than competing technologies, thanks to the independent nature of the community and EDB’s relentless commitment to Postgres innovation and development. Our technology approach delivers a “single database everywhere” to any platform including self-managed private clouds and self-managed public clouds, but our fully managed public cloud is the most important accelerator. The fact that we simultaneously deliver breathtaking cost reductions is the icing on the cake.

      Additionally, the fact that more developers love, use and want Postgres than any other database in the world is an important “tell” on this prediction. 

      Developers and business leaders alike seek data ownership and control and they simply don’t have time—or money—to waste. That is why they need a Postgres acceleration strategy, and only EDB can provide that.  

      Inhibitors to the adoption of Postgres are primarily awareness, staff education and training — all areas that the C-Suite can play a big leadership role in changing. Great leaders recognize the need for expertise from a company that deeply understands Postgres and enables them to run data anywhere. That’s EDB. 

      Our business is built to remove barriers. Some of the biggest companies in the world including Apple, Daimler, Goldman Sachs, and others have already adopted Postgres as their database standard. It’s not a matter of if, but when the majority of enterprises will follow suit.


      M.R. Asks 3 Questions: Rohit Choudhary, Co-Founder & CEO of Acceldata


      Rohit Choudhary is the Founder & CEO of the market leader in data observability, Acceldata.

      Having data alone isn’t enough to deliver value to an enterprise; a report by HT Mint found that over 90% of the data available in the world today was generated in the last two to three years alone. Putting it all together is what drives results. For enterprises, data comes in all shapes and sizes, but in the era of hyper-information, disinformation can be equally destructive.

      Rohit credits his success to his engineering roots, continuous innovation, a humbling sequence of entrepreneurial learning from successes and failures, and cultural alignment that kept his team together for nearly 20 years. Fresh off its $50 Million Series C funding round, Acceldata is leading the charge for the data observability industry, giving operational control back to understaffed data teams while maximizing ROI.

      M.R. Rangaswami: What is Acceldata’s founding story and what led you to raise a significant $50 million Series C funding in the face of economic turmoil?

      Rohit Choudhary: My co-founders and I started Acceldata in 2018 after recognizing that a better solution was needed to monitor, investigate, remediate, and manage the reliability of data pipelines and infrastructure. Having built complex data systems at several of the world’s largest companies, we saw clearly that enterprises were trying to build and manage data products using tools that weren’t optimized for the task. Despite significant investments, data teams still couldn’t see or understand what was happening inside mission-critical analytics and AI applications, and were failing to meet reliability, cost, and scale requirements.

      Since our launch, we have seen tremendous company momentum and were fortunate to secure a significant Series C funding round in the midst of an economic downturn. As a result, I can confidently say we’ve built the world’s most comprehensive and scalable data observability platform, correlating events across data, processing, and pipelines to transform how organizations develop and operate data products. Our funding speaks to the true value that organizations across the globe are achieving with data observability, and we’re excited to push the industry even further into the limelight. 


      M.R.: What is the importance of having reliable and established data across the enterprise? What consequences will companies experience without it?

      Rohit: While an organization’s data is among its most valuable assets, data alone isn’t enough to deliver business value to an enterprise. Being able to piece it together to provide meaningful insights is what actually drives results and ROI. 

      With the migration of data and analytics to the cloud, data volume and data movement are more significant than ever. There is data-at-rest, data-in-motion, and data for consumption, each having different stops in the modern data stack that make it difficult for organizations to get a good handle on their data. Data reliability ensures that data is delivered on time with the utmost quality so business teams can make consistent, timely, and accurate decisions.

      In the era of hyper-information, disinformation can be extremely destructive. However, the quality and integrity of the data in hand are what define the return on investment for various analytics and intelligence tools. 

      M.R.: What steps can organizations take to structure a logical plan of action to manage, monitor, and demystify data quality concerns and data outages?

      Rohit: Data observability is the most logical plan of action to manage, monitor, and demystify data quality concerns, misinformation, and data downtimes. Software firms rely on observability as a solution to tackle data quality challenges and pipeline issues. Observability goes above and beyond just routine monitoring. It ensures teams are on top of breakdowns and manages data across four layers: Users, Compute, Pipeline, and Reliability.   
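      A minimal sketch of what per-batch reliability checks might look like in practice (the thresholds, column names, and check labels below are illustrative assumptions, not Acceldata’s API):

```python
from datetime import datetime, timedelta, timezone

# Illustrative per-batch reliability checks: freshness, volume, and schema.
EXPECTED_COLUMNS = {"order_id", "amount", "ts"}

def check_batch(rows, last_loaded_at, max_age=timedelta(hours=1), min_rows=1):
    """Return a list of issues found in one batch of pipeline output."""
    issues = []
    if datetime.now(timezone.utc) - last_loaded_at > max_age:
        issues.append("stale: data older than SLA")
    if len(rows) < min_rows:
        issues.append("volume: too few rows")
    if any(set(row) != EXPECTED_COLUMNS for row in rows):
        issues.append("schema: unexpected columns")
    return issues

rows = [{"order_id": 1, "amount": 9.99, "ts": "2023-01-01T00:00:00Z"}]
print(check_batch(rows, datetime.now(timezone.utc)))  # []
```

      Observability platforms run checks like these continuously and correlate failures across pipelines, rather than leaving each team to bolt them onto individual jobs.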

      Throughout the entire data process – from ingestion to consumption – data pipelines are moving data from disparate sources in an attempt to deliver actionable insights. When that data is accurate and timely, those insights help the enterprise gain a competitive advantage, and deliver the promise of an efficient data-driven enterprise. 


      Quick Answers to Quick Questions: Mark Greenlaw, VP of Global Marketing Strategy, Cirrus Data


      It’s been said that 2023 is the year hybrid evolves to multi-cloud for enterprises, pushing the importance of data migration to the forefront for IT decision-makers.

      Data is the lifeblood of the enterprise and now its movements have become even more complicated. In our quick conversation, Cirrus Data’s VP of Global Marketing Strategy, Mark Greenlaw, shared his observations on what’s happening with data mobility speeds, flexible storage architecture, and multi-cloud transformations.

      M.R. Rangaswami: What are companies missing about how digital transformation impacts cloud adoption? 

      Mark Greenlaw: The phrase “digital transformation” has become a sort of catchall to describe everything from the process of modernizing applications to creating new digital business models. In reality, digital transformation is not replicating an existing service, but using technology to transform the service into something significantly better.

      Unfortunately, less than 20% of companies that embarked on digital transformation strategies have been successful. There are varying reasons for the lack of sustained improvements from transformation initiatives, but infrastructure challenges are among the top. The cloud offers relief from rigid on-premises environments and accelerates time to market.

      Public cloud companies now offer flexibility, access to third-party ecosystems, automation, and the ability to truly transform services.


      M.R.: What do you advise organizations consider before a multi-cloud strategy?

      Mark: Companies have been moving to the cloud for several years, but not all clouds are equivalent. As cloud adoption has grown, it has become clear that different cloud services are ideal for different applications, workloads, and business processes. Today, many organizations harness a mix of private, hybrid, and public clouds. Selecting the right cloud service and understanding how it integrates into your environment is an important first step.

      It can be a challenge to determine which cloud is right for each scenario, but once you’ve made that decision, executing the migration is often a roadblock. A ‘lift and shift’ strategy without optimization often doesn’t yield the anticipated ROI. We often hear from organizations that they are surprised by the costs of the cloud. And once they have moved their workloads to the cloud, moving them between clouds can be cost-prohibitive without the right data mobility solutions in place.

      As part of planning a cloud strategy, data mobility needs to be a key consideration. What is the strategy to de-duplicate and compress your workloads? Do you have a solution that will enable you to move data while it is in use? Can you move data between clouds without exorbitant egress fees? These are all questions that, when tackled at the beginning, will ensure your program’s success.

      M.R.: Is moving block data to a new environment a high-stakes move?

      Mark: Block data refers to mission-critical databases and applications which are structured data owned directly by applications. The loss of block data can have a catastrophic impact on business operations. Historically, storage experts would spend months planning the migration of this data onto a new storage platform. Legacy migration processes were manual, time-consuming, and prone to human error. For one customer in the travel and leisure industry, their initial attempt to migrate their block data took 18 months and they only managed to move a quarter of the overall traffic. It had a serious impact on their digital transformation plans.

      It’s also important to consider the difference between data migration and data mobility solutions. Data migration is for one-time moves from one platform to another. Data mobility allows organizations to move data between platforms accurately and without delays. Data mobility is essential to maximizing a multi-cloud strategy.  Whether you need to move your data for a specific project or you want the flexibility of continuous data mobility, automation and moving data while it is in use dramatically accelerates the speed of the process.

      When you can automatically throttle the migration speed around usage, you have the ability to reduce the time spent and bandwidth used by up to 6x.  Designing a strategy to manage your data mobility at the beginning of your cloud journey will lead to increased ROI and a better overall experience.
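      The usage-aware throttling Mark describes might be sketched like this (the `io_load` hook, rates, and pacing formula are assumptions for illustration, not Cirrus Data’s implementation):

```python
import time

def migrate(blocks, io_load, max_rate=100.0, min_rate=10.0):
    """Copy blocks while pacing the copy rate (blocks/sec), backing off as
    production I/O utilization rises. `io_load()` is an assumed hook that
    reports current utilization in the range 0.0-1.0."""
    copied = 0
    for _block in blocks:
        # High production load -> lower copy rate, floored at min_rate so
        # the migration always makes forward progress.
        rate = max(min_rate, max_rate * (1.0 - io_load()))
        # ... copy _block to the target platform here ...
        copied += 1
        time.sleep(1.0 / rate)  # pace ourselves to the current rate
    return copied

print(migrate(range(3), lambda: 0.5))  # 3
```

      The key property is that the migration yields bandwidth to production traffic automatically, instead of requiring a maintenance window.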


      M.R. Asks 3 Questions: Armon Petrossian, CEO and Co-Founder of Coalesce


      CEO and Co-Founder of Coalesce, Armon Petrossian, launched his company from stealth in January 2022 to solve the largest bottleneck in the analytic space: data transformations.

      The 29-year-old entrepreneur is focused on helping enterprises overcome the pressing challenge of converting raw data into a structure better suited for consumption, a process that can take months or even years, so they can meet daily organizational and operational data-driven demands. The company is currently going head-to-head with dbt Labs and Matillion in the data transformation space.

      M.R. Rangaswami: What are the core challenges you find that are associated with operationalizing data?

      Armon Petrossian: Companies have been struggling with data transformation and optimization since the early days of data warehousing, and with the enormous growth of the cloud, that challenge has only increased. Data teams, in particular, are challenged with the everyday demands of the business and the shortage of skilled data engineers and analysts to combat the growing volumes and complexity of data. 

      We are on a mission to radically improve the analytics landscape by making enterprise-scale data transformations as efficient and universal as possible.  We see the value of Coalesce’s technology as an inevitable catalyst to support the scalability and governance needed for cloud computing.

      One of the most rewarding aspects of my role at Coalesce is seeing the impact our solution has on organizations that want to drive value out of their data. This is especially true for companies that deal with complex data sets and/or are in highly regulated industries. 

      One of our most recent customer success stories involves partnering with an organization that helps big restaurant brand clients leverage their customer data to show that the brand knows and understands its customers. Helping its numerous clients improve their digital marketing funnel and offering customers a frictionless experience every time they visit the store, whether in person or online, relies heavily on data. This requires having the ability to glean useful insight from data quickly and easily. Coalesce, alongside Snowflake’s Snowpark, was able to help their data science team complete a high-profile transformation in one month, whereas before, the entire team spent 6 months without much progress.

      M.R.: What exactly is data transformation? Why does it play such a critical role in the future of data management and the analytics space?

      Armon: It’s important to look at how we consume data to understand why data transformations are so important. Initially, organizations that were adopting cloud platforms like Snowflake hit a major hurdle which was getting access to data from their source systems. As that problem has been largely solved by companies like Fivetran, and getting access to different types of data has become much easier, transforming that data to create a cohesive view is the logical next step for businesses to accomplish. This becomes dramatically more difficult as you begin to integrate data from traditional on-premises platforms, like Teradata or Oracle, along with a variety of different web sources. For example, companies may look at vast amounts of historical data to understand how their production line performs in certain scenarios or look into demographic information to target the right potential customers. Whatever the reason, the analytics are only as good as their ability to curate data from various sources and transform it into a consumable format for the analytics and data science teams.
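      The curation step Armon describes, pulling raw records from multiple sources into one consumable view, can be sketched minimally (the source systems and field names are illustrative, not Coalesce’s API):

```python
# Raw records from two hypothetical source systems: a CRM export and an
# orders feed, joined into one view an analytics team could consume directly.
crm = [{"cust_id": 1, "name": "Acme"}, {"cust_id": 2, "name": "Globex"}]
orders = [{"cust_id": 1, "total": 250.0}, {"cust_id": 1, "total": 100.0}]

def customer_view(crm_rows, order_rows):
    """Aggregate order totals and join them onto CRM records by customer id."""
    totals = {}
    for o in order_rows:
        totals[o["cust_id"]] = totals.get(o["cust_id"], 0.0) + o["total"]
    return [
        {"name": c["name"], "lifetime_value": totals.get(c["cust_id"], 0.0)}
        for c in crm_rows
    ]

print(customer_view(crm, orders))
# [{'name': 'Acme', 'lifetime_value': 350.0}, {'name': 'Globex', 'lifetime_value': 0.0}]
```

      At enterprise scale the same join-and-aggregate pattern spans dozens of systems and schemas, which is why automating and governing it becomes the bottleneck.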

      With Coalesce, the data can be organized in an easy-to-access and read fashion while using automation to streamline the process and limit the amount of time needed by highly skilled engineers. This ensures that companies are accessing high-quality data that is easy to use for a variety of purposes, an experience that is not guaranteed with existing tools. With our column-aware architecture, enterprises have the ability to efficiently and easily manage not only existing data but also new datasets as they grow and scale. 

      M.R.: What are your best practices for enterprises that are looking to keep up in today’s data-rich world?

      Armon: My suggestions for best practices can be broken down into four areas:

      i. Be Data-Competitive: Data competitiveness is key for every business, but given the enormous amounts of data being generated by modern enterprises, IT teams are falling behind in organizing and preparing data so that it can be made available to business teams to guide informed decisions.

      ii. Embrace the Cloud: Managing hardware or technology on-premises is expensive, time-consuming and risky. In U.S. history, cars were not nearly as impactful to daily life as a form of transportation until the infrastructure of roads was built across the country. We’re now seeing a similar economic boom with the way the cloud allows access to data for organizations that would have never been able to achieve similar use cases or value previously.

      iii. Evaluate Efficiency: IT teams finally understand how important efficiency can be to help deliver a continued competitive edge for enterprises. When applicable, data automation reduces time, effort, and cost while reducing tedious and repetitive work and allowing teams to focus on additional use cases with high-value data objectives.

      iv. Strive for Scalability: With more data and the proliferation of the cloud, organizations are challenged with scaling IT systems while maintaining flexibility and control. Companies should look to implement processes that offer the speed and efficiency needed to achieve digital transformation at scale and to meet increasing business and customer demands.


      M.R. Asks 3 Questions: AB Periasamy, Co-Founder & CEO, MinIO


      AB’s unicorn company has pioneered high-performance Kubernetes-native object storage, helping enterprises use the cloud operating model to determine where to run their workloads – depending on what they are optimizing for.

      As a Series B company, MinIO has $126 million in funding raised to date, with a billion dollar valuation. Investors include Intel Capital, Softbank Vision Fund 2, Dell Technologies Capital, Nexus Venture Partners, General Catalyst and key angel investors.

      As one of the leading proponents and thinkers on the subject of open source software, AB is able to masterfully articulate differences between philosophy and business models – and how the two create cloud function.

      M.R. Rangaswami: Can you explain all this chatter about cloud repatriation?

      AB Periasamy: Simply put, the concept of “cloud repatriation” is repatriating workloads from public clouds to a private cloud. For years, the mantra of the cloud was fairly straightforward: put everything in the public cloud and keep it there forever. This model made sense as businesses optimized for elasticity, developer agility, service availability and flexibility. 

      Things changed when businesses reached scale, however, as the benefits were swamped by economics and lock-in. This is leading many enterprises to re-think their approach to the cloud – with a focus on the operating model of the cloud – not where it runs. 

      It’s important to remember the cloud operating model has a cycle. There are times to leverage the public cloud. There are times to leverage the private cloud. There are times to leverage the colo model. Given the ecosystem that has built up around the cloud – there is certainly self-interest in driving enterprise workloads in that direction – there are the consulting fees to get you there and the consulting fees to manage costs once you realize it is more expensive than forecasted. Nonetheless, sophisticated enterprises are increasingly taking their own counsel on determining what is best for the business – and that is driving the repatriation discussion. 

      M.R.: What are the key principles of the cloud operating model?

      AB: The cloud is not a physical location anymore. Today, the tooling and skill set that was once the dominion of AWS, GCP and Azure, is now available everywhere. Kubernetes is not confined to the public cloud distributions of EKS, GKE and AKS – there are dozens of distributions. MinIO, for example, works in the public cloud, private cloud and the edge. The building blocks of the cloud run anywhere.

      Developers know this. It is why they have become the engine of value creation in the enterprise. They know the cloud is about engineering principles, things like containerization, orchestration, microservices, software-defined everything, RESTful APIs and automation.  

      Understanding these principles and understanding that they operate just as effectively outside of the public cloud creates true optionality and freedom. There is no “one” answer here – but with the cloud operating model as the guide, enterprises create optionality. Optionality is good.

      M.R.: How has the cloud lifecycle changed and is repatriation the answer?

      AB: Early cloud native adopters quickly learned principles of the cloud. Over time, workloads grew and costs ballooned. The workloads and principles were no longer novel – but the cost to support the workloads at scale was.

      For enterprises, it has become clear that the value has been inverted by the costs of remaining on the cloud. This is the lifecycle of the cloud. You extract the agility, elasticity, and flexibility value, then you turn your attention to economics and operational acuity.

      Repatriation is but one tool. There are many. It is really about optimization. What you are optimizing for should help determine where you should run your workload. At MinIO, we are agnostic; you can find us in every cloud marketplace (AWS, Azure, GCP, IBM). You can find us on every Kubernetes distribution (EKS, GKE, AKS, OpenShift, Tanzu, Rafay). That is the definition of multi-cloud.

      We talk about balancing needs and optimizing for workloads. Again, some workloads are born in the public cloud. Some workloads grow out of it. Others are just better on the private cloud. It will depend. 

      What matters is that when your organization is committed to the principles of the cloud operating model you have the flexibility to decide and with that comes leverage. And who doesn’t like a little leverage – especially in today’s economy.


      SEG’s 2023 Annual SaaS Report


      As we round the corner on the first quarter of 2023, we thought it would be an appropriate time to check in and review Software Equity Group’s Annual Report.

      According to SEG’s report, SaaS continues to be an attractive asset class for private equity and strategic buyers. M&A deal volume in 2022 surpassed 2,000 transactions for the first time, a 21% increase over 2021.

      Private equity buyers with record amounts of dry powder drove volume and valuations, comprising nearly 60% of SaaS M&A deals (a record for annual activity) and accounting for some of the highest multiples in 2022.

      Public market indices across the board struggled to overcome the tumultuous macroeconomic landscape of 2022. While multiples continued to decline from the unsustainable run-up in 2021 (14.7x), public SaaS companies in the SEG SaaS Index demonstrated operational resiliency. The median EV/Revenue multiple sat 15% higher than 2018’s pre-pandemic levels, which were considered healthy at the time. What’s more, recent indicators show inflation moderating and the potential easing of interest rate hikes, which should bode well for SaaS multiples going forward.

      Here are 5 summary points to note:

      1. Private equity capital overhang and fierce strategic competition catalyzed SaaS M&A activity and buoyed EV/Revenue multiples in 2022, despite broader macroeconomic turbulence.
      2. SaaS M&A deal volume remains near peak levels, reaching 2,157 deals in 2022 and growing 21% over 2021.
      3. The median EV/Revenue multiple for SaaS deals jumped to 5.6x in 4Q22, surpassing the median SEG SaaS Index public market multiple of 5.4x. Buyers and investors paying a premium for high-quality assets bolstered valuation multiples for SaaS M&A in 2022.
      4. Private equity-driven deals accounted for the highest percentage of transactions to date on an annual basis (59.5%) due to the record amount of capital raised demanding deployment to worthy assets.
      5. Noteworthy deals include Adobe’s acquisition of Figma ($20B), Vista Equity’s acquisition of Citrix ($16.5B), and ICE’s acquisition of Black Knight ($16B).
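      To make the EV/Revenue figures above concrete, here is a minimal illustration of how a revenue multiple translates into an implied deal valuation. The $50M revenue figure is hypothetical; only the 5.6x median multiple comes from the report.

      ```python
      def implied_enterprise_value(annual_revenue: float, ev_revenue_multiple: float) -> float:
          """Return the enterprise value implied by an EV/Revenue multiple.

          EV = annual revenue x multiple. This is the arithmetic behind the
          valuation multiples quoted in SaaS M&A reports.
          """
          return annual_revenue * ev_revenue_multiple

      # A hypothetical SaaS company with $50M in revenue, valued at the
      # 4Q22 median SaaS deal multiple of 5.6x:
      ev = implied_enterprise_value(50_000_000, 5.6)
      print(f"${ev / 1_000_000:.0f}M")  # roughly a $280M enterprise value
      ```

      The same arithmetic run in reverse (EV divided by revenue) is how the multiples in the report are derived from disclosed deal terms.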

      SEG’s full 2023 SaaS Report is available from Software Equity Group.

      M.R. Rangaswami is the Co-Founder of


      Quick Answers to Quick Questions: Dominic Lombardi, Vice President of Security & Trust, Kandji

      By Article

      Sharing his list of what organizations must pay attention to when it comes to their security, Kandji’s VP of Security and Trust, Dom Lombardi, details how organizations can stay one step ahead of this year’s risks, threats, and potential attacks.

      M.R. Rangaswami: With the higher risk of infrastructure attacks, what will be the biggest thing to stay ahead of in order to avoid concerted attacks against organizations?

      Dom Lombardi: Attackers will continue to become more creative in their pursuits. It has been reported that about 25% of all data breaches involve phishing and 82% of breaches involve a human element. Many of the security controls we put in place earlier are at risk of being bypassed due to human error. Financially motivated cybercriminals will concentrate on corporate entities, where they will try to derive personally identifiable information (PII) or customer payment card information.

      Further, “strategic viability” attacks against critical infrastructure systems will continue to increase. Think oil pipelines, power generation, rail systems, electricity production, or industrial manufacturing. There is still the possibility that key government or corporate services could be targeted — something tied to global tensions.

      M.R.: Why is it important for companies to prioritize zero trust in their cybersecurity strategies?

      Dom: Security teams have been talking about the zero-trust cybersecurity approach for a few years. It used to be “trust, but verify.” The new zero trust — in a workplace filled with multiple teams, multiple devices, and multiple locations — is “check, check again, then trust in order to verify.”

      Organizations continue to play a cat-and-mouse game with hackers, attackers, and bad actors. Only 6% of enterprise organizations have fully implemented zero trust, according to a 2022 Forrester Research study.

      The complex and disparate workplace environments that are so common now make it difficult to adopt zero trust — at least all at once. If you are using AWS, Azure, and GCP alongside an on-premises instance and a private cloud running virtualization through VMware, it will take some time to roll everything out uniformly.

      As we all continue on the zero trust journey, we will see new solutions for the complex problems companies are experiencing on-premises and in public and private clouds. By mastering basic IT (and security) hygiene, updating and communicating your risk register (a manual that outlines current and potential security risks and how they could impact the organization), and working steadily toward a zero-trust security model, you’ll be one step ahead of most other organizations — and hopefully two steps ahead of the hackers!

      M.R.: As companies continue to build their security plans, how will the role of the CISO expand at organizations?

      Dom: The CISO can (and should) continuously champion the risk register to ensure they receive the resources needed to remediate and reduce risk on an ongoing basis. Keep in mind that new threats, risks, and updates will always populate your risk register. It is critical to actively work through this list to prevent risks from escalating and becoming even more complicated.

      Additionally, to prevent miscommunication and promote total transparency, any CISO who does not report directly to the CEO should demand that they do — immediately. Organizations need to take a risk-conscious approach to developing their security program and risk mitigation strategies.

      A CISO must report to the CEO to ensure direct lines of communication regarding risk scenarios and potential loss events. CEOs are ultimately accountable for the course of action they set the organization on, and CISOs provide the CEO with the direction and guidance to make informed, risk-conscious decisions.

      To set themselves up for success, CISOs should ensure that the general counsel at their organization is in their “peer set.” This relationship with your general counsel is integral to a unified approach to legal and security risk mitigation. The organization’s general counsel and CISO share a common goal: to keep the company, their customers, and the organization’s leaders safe.

      M.R. Rangaswami is the Co-Founder of
