M.R. Asks 3 Questions: James Harold Webb, Chairman and CEO of Paradigm Development Holdings

James Webb says the difference between success and failure often comes down to whether the person thinks big in the early stage of the business.

Author of Redneck Resilience: A Country Boy’s Journey To Prosperity, James is an investor, philanthropist and successful multi-business owner. He began his entrepreneurial journey in the health industry as the owner of several companies focused on outpatient medical imaging, pain management and laboratory services.

Following successful exits from those companies, James shifted his focus to the franchise world and developed, owned and oversaw the management of 33 Orangetheory Fitness® gyms, which he sold in 2019. Not one to stop, he currently has two additional franchise companies in various stages of growth.

His perspective as a lifelong entrepreneur offers valuable guidance for those looking to branch out into building businesses of their own and connecting that work with their big-picture plan.

M.R. Rangaswami: What are the top two most common missteps a young entrepreneur makes in their first two years of business?

James Harold Webb: There are many mistakes an entrepreneur can make during the start-up stage of their business. Taking money “off the table” too quickly can lead to an assortment of problems, including holding back your infrastructure build-out and expansion and creating cash shortages. Other than my “salary” (if needed), I tend to leave all the money in the business for several years. The only exception is determining any income tax consequences and taking what I call a “tax distribution,” solely for the purpose of paying the prior year’s income taxes or quarterly income tax payments.

I see too many 8-to-5ers who are not putting in the time or effort it takes to get a business off the ground and profitable. When you are ready to stop for the day, make one more phone call or send out one more email. Solve one more problem. Unbox one more package. Whatever it takes, just work harder than anyone else.

M.R.: How important is a leadership team in the early stages of building a business? What (if any!) budget should people allocate to that leadership team? 

James: Leadership is one of the key elements of a successful business. Creating a corporate culture from the beginning is crucial. Establishing relationships is also extremely high on the leadership list, whether it be with fellow corporate staff, employees, vendors, banking, or even competitors. Listen to people. Invest in people. Take the time to recognize people and to hold yourself accountable to them. Relationships will define your success.

M.R.: How can someone who is just starting their business beat the odds and not fail in the first five years? 

James: Work harder than anyone else.

Hope for the upside, but always plan for the downside. Stay focused on your upside and driving your business to success, but have a contingency plan for the “what ifs.”

Build a solid infrastructure before you reap the benefits of your venture. Find the right people who are dedicated to helping you reach your dream of success.

With employees, be clear in your expectations, hold them accountable, and be available to assist and direct as needed. Contrary to popular belief, you can be a boss and a “friend.” If they can’t get it done and you’ve done the previous, then it’s time to let them go.

M.R. Rangaswami is the Co-Founder of Sandhill.com

M.R. Asks 3 Questions: Sunil Sanghavi, CEO of NobleAI

2023 was undoubtedly the year that AI barnstormed our tech consciousness. Trained on massive amounts of public data, AI generated cool new images, wrote up content summaries, and delivered seemingly original work in the blink of an eye. Could this also be the future of helping companies balance the need for sustainable, green innovation against resource and supply chain constraints?

Artificial intelligence offers promise for accelerating materials and formulation R&D. But AI for science needs to be uniquely focused, applying small, curated, use-case-specific AI models that map to multiple scientific principles at a time to speed scientific discovery. This has the potential to be a game-changer across a wide range of fields, including medicine, agriculture, engineering and more, which is why Sunil believes that 2024 will be the year for specialized AI.

Sunil Sanghavi is currently CEO of NobleAI, a pioneer in Science-based AI solutions for chemical and material informatics. He has a rich and diverse operating background in deep-tech companies over 40 years. Most recently, he was Senior Investment Director at Intel Capital, where he invested in AI/ML hardware and software companies including Motivo, Untether AI, Syntiant, and Kyndi. He attended the MSc Chemistry program at the Indian Institute of Technology Bombay and obtained a BSEE from Cal Poly, San Luis Obispo.

M.R. Rangaswami: You have an impressive resume leading a variety of companies. What led you to NobleAI at this time? 

Sunil Sanghavi: Generative AI dominated the discussion in 2023, and will certainly continue to be a fascinating area to watch. At this point most people have experimented with the many available LLM-based tools and understand how they can help us with everyday tasks. But what I find most exciting is the opportunity to apply AI to speed scientific discovery. Science-based AI (SBAI) has the potential to be a game-changer across a wide range of fields, including chemistry, materials, energy and many others.

That area is very exciting to me and is what drew me to NobleAI, where we’re showing the power of Science-Based AI to help companies achieve their goals. Large language models (LLMs), which is what GenAI is built on, basically scrape massive amounts of publicly available data. SBAI instead uses SSMs, or Smaller Science-infused Models: we apply the power of AI to private, industry- or company-specific data sets, and add to that applicable scientific laws and any available simulation data. This elegant process presents incredible opportunities to develop or improve chemicals, materials and formulations while also tackling pressing issues for companies like cost, supply chain and customer satisfaction. And unlike LLM-based solutions, SBAI is an ensemble of models, optimized for each specific use case. Our ability to do this for literally hundreds of use cases in roughly three person-months each, and at a deterministic cost, is what allows us to offer customized solutions while being able to scale NobleAI’s business.
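To make the "science-infused" idea more concrete, here is a minimal, hypothetical sketch of combining scarce experimental data with a known scientific law during model fitting. Nothing here reflects NobleAI's actual models or pipeline; the law (Hooke's law), the data, and all names are invented purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Scarce "experimental" data generated from Hooke's law F = k*x with noise.
k_true = 3.0
x_data = rng.uniform(0.0, 2.0, size=8)
f_data = k_true * x_data + rng.normal(0.0, 0.05, size=8)

# Candidate model: F = w1*x + w0. The known law implies F(0) = 0, so we add
# a penalty on w0 (a soft physics constraint) alongside the usual data loss.
def loss(w, lam=10.0):
    w1, w0 = w
    data_mse = np.mean((w1 * x_data + w0 - f_data) ** 2)
    physics_penalty = w0 ** 2  # violation of the known law F(0) = 0
    return data_mse + lam * physics_penalty

# Crude gradient descent via finite differences keeps the sketch minimal.
w = np.array([0.0, 1.0])
for _ in range(2000):
    grad = np.zeros(2)
    for i in range(2):
        e = np.zeros(2); e[i] = 1e-5
        grad[i] = (loss(w + e) - loss(w - e)) / 2e-5
    w -= 0.05 * grad

print(w)  # w[0] lands near k_true; the physics term pushes w[1] toward 0
```

The point of the sketch is only that a scientific law acts as an extra training signal, letting a small model generalize from far fewer experiments than a purely data-driven fit would need.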

M.R.: What are the challenges to innovation using SBAI?

Sunil: As is the case with any technological advance, a change in mindset will be the most immediate challenge. Scientists and researchers are trained to advance or eliminate solutions based on empirical experimentation. This can be cost-prohibitive, and it is by its nature time-consuming and limited in scope. In fact, research into chemicals and specialized materials, an industry that spends $100 billion per year on R&D, has not experienced much innovation in the past 50 years for this very reason. Developing chemicals and materials is incredibly complex, often requiring experimentation across a multitude of parameters so that researchers can understand how hundreds of different ingredients interact at scales ranging from molecules to formulations. But now, AI for science is opening the door to a better approach, and NobleAI is leading the charge. The goal is to use AI to more rapidly explore a greater range of chemicals and materials in software (i.e., before going to the lab), saving potentially months or even years of R&D time.

M.R.: Where do you see this really taking off first? What are the emerging trends that are most exciting?

Sunil: To me the most exciting possibilities are in the area of sustainability. There’s a big push to improve the safety of material ingredients for both the environment and human health. For instance, more people, organizations and regulators are now talking about the need to replace forever chemicals. But anytime there’s a need to replace an ingredient, it can be a real challenge for companies to find substitutes. That’s why you often see a knee-jerk reaction to fight a new environmental regulation. The great thing about Science-Based AI is that we can turn that around: SBAI can help companies stay ahead of the shifting regulatory environment while also supporting them in getting behind sustainability initiatives. I call what we do “Good AI, For Good.”

M.R. Rangaswami is the Co-Founder of Sandhill.com

The Annual SaaS 2024 Report

Software Equity Group’s annual report is in, revealing that SaaS is here to stay.

As the report details, for many in the technology industry the story of 2023 was all about artificial intelligence, its rapidly advancing commercial applications, and the speed and extent with which it will impact the world we live in, both from a business and a personal perspective.

4 SaaS components of 2023 that will impact what we see in 2024.

1. The advancement of generative AI and its impact on software and SaaS companies, both as users and creators of AI, was a top story in 2023 and one that will be front and center in 2024 as well.

However, quietly and perhaps a bit behind the scenes, another storyline proved to be just as important in 2023: the resilience of the U.S. economy and subsequent cementing of software and SaaS’s place as a key pillar driving digital transformation globally.

2. Inflation decreased by nearly half (with the CPI dropping from 6.5% in December 2022 to 3.4% in December 2023), interest rates stabilized, and the labor market remained strong (unemployment at 3.7%, with 216k jobs added in December).

3. Software and SaaS companies pivoted towards operational efficiency, and fortunately for the U.S. economy, many of these companies were successful in this endeavor. The result was a fantastic year for the SEG SaaS Index™, with the Index increasing 34% YOY, outpacing the S&P 500 and Dow Jones, and trailing only the Nasdaq (43% increase) among major indices.

On the M&A side, there were over 2,000 SaaS transactions, making 2023 the second strongest year on record for SaaS M&A, only narrowly trailing 2022.

4. While AI garnered a lot of the hype in 2023, an equally important story is the strength and resilience of the software ecosystem. 2023 was another proof point that SaaS is “here to stay.”

4 Macroeconomic Outlooks for 2024: Inflation, interest rates, employment, growth and politics

1. Inflation continues to decrease, finishing 2023 at 3.4% YOY compared to December 2022. The underlying core CPI, which strips out volatile food and energy prices, measured 3.9% in December 2023, its lowest YOY change since May 2021. Though additional cooling is still needed for inflation to reach the 2% annual target the Federal Reserve sets, the progress made in 2023 is encouraging.

2. The prospect of the Federal Reserve cutting interest rates is coming into focus. The Fed will closely watch inflation and the unemployment rate (which remains solid at 3.7%) as it plots its course through this year.

The timing of potential cuts will greatly impact publicly traded SaaS stocks and the M&A markets, as the potential for a lower-cost borrowing environment would be a welcome sight to these markets.

3. What about a recession? 2023 growth is now expected to come in between 2 and 3%.
GDP growth is expected to decline slightly in 2024 but remains positive at around 2%. This scenario avoids a recession altogether and supports a healthy economic environment.

A scenario in which the U.S. beats GDP estimates again provides an upside case for publicly traded SaaS stocks in 2024. This possibility is further bolstered by the recently released Q4 GDP data, in which the U.S. GDP grew 3.3%, beating consensus estimates.

4. The economy will be a primary focus on the 2024 campaign trail. However, the reality remains that the Federal Reserve dictates monetary policy independent of political election cycles.

Election risk is still present due to the divisive nature of the current U.S. political environment,
albeit much less discussed than during the last cycle.

Globally, geopolitical risks include regional conflicts in the Middle East and their impact on oil prices, the ongoing Russia-Ukraine war, and tensions between China and Taiwan.

To read the details of Software Equity Group’s 2024 SaaS Report, click here.

5 Most Impactful Factors In Valuation of Technology Companies

The turbulent markets of 2022-2023 and volatility in the M&A environment have brought the topic of valuation to the forefront in many of our discussions with founders and investors.

Regardless of market ups and downs, the factors that are most impactful to valuation remain relatively constant, with some standards changing with market cycles as witnessed over the past decade. Safe to say, valuation continues to be both art and science.

Allied Advisers put together this article as a refresher on some of the most important valuation factors in the current market for technology companies; we hope our report also serves as broad guidance to founders, executives and investors in achieving an optimal valuation outcome for their business.

It is often said that valuing a business is more an art than a science. Another assertion is that
valuation is in the eye of the beholder, akin to beauty. There is truth in both these statements since
enterprise valuation is impacted by several variables, not all of which can be quantified, and
perception of future prospects of a business can be quite different depending on the biases of the
evaluator.

Regardless of this sense of mystery and fuzziness about valuation, there are several fundamental
factors that influence the value of a technology business.

In this article, we cover five important elements that have a distinct bearing on the valuation of technology companies, noting that many of these factors apply to businesses in other sectors as well.

1) Scarcity in a Large Market
A business that is the only player, or one of just a few players, in a large end market is likely going
to be seen as being valuable since there are limited substitutes for the scarce solution offered by
that company. It is simple supply-demand dynamics – when there is clear demand for a product in
short supply, the price of that product goes up.

(Read more)

2) Significant Differentiation from Competitors
Often referred to as “USP” or unique selling proposition, differentiation of a technology business is
important to valuation since it creates scarcity and sets the business apart from its competition.
Differentiation may come from unique product features, ability to address challenging use cases,
performance metrics, superior UI design, ease of deployment and use, economic value to the
customer (time to value, ROI), etc.

(Read more)

3) Growth vs. Profit Margin and Rule of 40; Capital Efficient Growth
In the frothy market prior to COVID that eventually peaked in 2021, hypergrowth was the mantra
for technology companies. Businesses that grew at a breakneck pace with no heed to bottom-line
profitability attracted nosebleed valuations in private funding rounds. A popular performance
measure for software companies called the Rule of 40 (revenue growth rate + profit margin > 40%)
was highly biased towards revenue growth; companies that grew at 100% with a -50% operating
margin (R40 metric = 50%) were highly valued due to their growth and, despite poor profit
margins, easily attracted capital.

(Read more)
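The Rule of 40 arithmetic described above is simple enough to sketch directly; the function names below are illustrative, not from the report.

```python
# Rule of 40: revenue growth rate + profit margin, with > 40% considered
# healthy. Both inputs are in percentage points.

def rule_of_40(growth_pct: float, margin_pct: float) -> float:
    """Return the Rule of 40 score for a software company."""
    return growth_pct + margin_pct

def passes_rule_of_40(growth_pct: float, margin_pct: float) -> bool:
    """True when the combined score clears the 40% threshold."""
    return rule_of_40(growth_pct, margin_pct) > 40.0

# The hypergrowth example from the text: 100% growth, -50% operating margin.
print(rule_of_40(100.0, -50.0))         # 50.0
print(passes_rule_of_40(100.0, -50.0))  # True
```

As the text notes, the metric treats a point of growth and a point of margin as interchangeable, which is why growth-at-all-costs companies could clear the bar despite deeply negative profitability.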

4) Revenue Model and Gross and Net Revenue Retention Metrics
Business models typical to technology product/platform companies are subscription, licensing or
transactional. Subscription models provide recurring revenue (monthly or annually), licensing is
usually a one-time fee, and the transactional model provides revenue per transaction.

(Read more)
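The retention metrics named in the heading above follow standard industry definitions; a brief sketch, with illustrative cohort figures that are not from the article:

```python
# Gross and Net Revenue Retention over a cohort of existing customers.
# GRR excludes expansion and is capped at 100% by construction; NRR
# includes upsell/expansion and can exceed 100%.

def gross_revenue_retention(start_arr: float, churn: float,
                            contraction: float) -> float:
    """GRR %: recurring revenue kept from the cohort, ignoring expansion."""
    return (start_arr - churn - contraction) / start_arr * 100

def net_revenue_retention(start_arr: float, churn: float,
                          contraction: float, expansion: float) -> float:
    """NRR %: recurring revenue kept from the cohort, including expansion."""
    return (start_arr - churn - contraction + expansion) / start_arr * 100

# Illustrative cohort: $1,000k starting ARR, $50k churned, $30k contracted,
# $150k of expansion over the year.
print(gross_revenue_retention(1000, 50, 30))     # 92.0
print(net_revenue_retention(1000, 50, 30, 150))  # 107.0
```

An NRR above 100% means a subscription business grows even before adding new logos, which is why investors weight these metrics heavily when valuing recurring-revenue models.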

5) Customer Profile and Concentration
Companies that have large enterprises as customers are more likely to be able to expand revenues
from such clients given the numerous groups within large organizations and bigger budgets for
vendors. In contrast, having small/medium (SMB) customers limits the opportunities for large
contracts and wallet share expansion given limited budgets. For these reasons, companies with an
enterprise customer base have traditionally been viewed more favorably by investors compared to
businesses serving SMB clients.

(Read more)

To read the full report, click here.

Ravi Bhagavan is a Managing Director at Allied Advisers

M.R. Asks 3 Questions: Ofer Klein, CEO & Co-founder of Reco.AI

Ofer Klein is a decades-long Israeli Defense Force helicopter pilot and avid kitesurfing enthusiast who likens the adrenaline rush to being a founding CEO of a thriving security startup. It’s this unique background and experience that have been key to Ofer’s leadership style and Reco’s success. 

Ofer and his fellow co-founders developed the platform and AI algorithm for counterintelligence use by the Israeli government, and decided to productize the platform in 2020, which led to the birth of Reco.ai. Now, Reco.ai is a leading company focused on safeguarding organizations with its modern, AI-driven SaaS security offering.

M.R. Rangaswami: What security concerns are not being talked about enough today?

Ofer Klein: There are a few. One is security keys replacing multi-factor authentication (MFA). MFA is a common method of adding a second layer of security onto SaaS applications (in addition to a password). But MFA is not the only security boundary, as SaaS applications are beginning to use security keys for secondary verification. Security keys are physical devices that use a unique PIN, available only on that device, to authenticate.

Another is Microsoft 365 and Okta cyber attacks. A growing concern is maintaining the security of core SaaS applications such as Microsoft 365 and Okta: because they are foundational to making SaaS programs run, they attract more cyber threats and could potentially become the next SolarWinds. Despite growing security threats, these technologies have experienced an uptick in adoption. The security built into Microsoft 365 E5 and Okta isn’t enough, however, to keep the application and the organizational data stored in it secure, prompting organizations to look for dedicated SaaS security solutions.

M.R.: Why is securing SaaS applications so important?

Ofer: During the pandemic, cloud collaboration tools fundamentally changed the way modern organizations work. Enterprises today are adding new applications to their technology stack at an unprecedented rate, using an average of 371 SaaS applications. This dramatic increase has resulted in an elevated demand for a security solution that provides full visibility into everything connected to a company’s SaaS environment, and at the same time, ensures it complies with regulations. 

Attempting to secure new SaaS tools with techniques that were developed for legacy on-premise systems restricts collaboration and misses a broad range of security events. Only by understanding the complete business context of an interaction can security analysts identify and interpret potential threats, and also determine the best and most efficient way to respond.

M.R.: What role does AI play in solving SaaS security?

Ofer: As in many sectors today, AI is revolutionizing the security industry. Leveraging AI to identify and address security vulnerabilities is rapidly growing and very effective. This is especially true for companies adding new generative AI applications to their technology ecosystems, as this can expose an organization to added risk through the sharing of emails, recorded calls, and other data. Incorporating AI models, techniques, and processes like Large Language Models (LLMs), Knowledge Representation Learning, and Natural Language Processing (NLP) gives companies greater visibility and allows them to discover potentially risky events (such as the improper use of AI tools) and be alerted to data exposure, misconfigurations, and mispermissions around a user.

The incredibly fast adoption of generative AI tools has led to new data risks, such as privacy violations, fake AI tools, phishing and more. As a result, organizations need to establish AI safety standards to keep their customer and employee data safe. Having a SaaS security solution that can identify connected generative AI tools is critical. 

AI is foundational to our SaaS security offering and enables enhanced functionality and effectiveness. Our proprietary and patented AI algorithm powers our Identities Interaction Graph, which correlates every interaction between people, applications, and data, and then assesses potential risk from misconfigurations, over-permissioned users, compromised accounts, risky user behavior, and also the use of generative AI applications.

One-third of organizations regularly use generative AI applications in at least one function, making it critical for SaaS security platforms to have the ability to discover anomalous behavior for both humans and machines and gain even deeper proactive threat mitigation.

M.R. Rangaswami is the Co-Founder of Sandhill.com

M.R. Asks 3 Questions: Sanjay Sathé, Founder & CEO, SucceedSmart

Sanjay Sathé, Founder & CEO of SucceedSmart, is no stranger to disrupting established industries. Previously, Sathé spearheaded RiseSmart’s evolution from a concept based on his personal experiences into a major disruptor in the $3B outplacement industry, becoming the fastest-growing outplacement firm in the world. In September 2015, RiseSmart was acquired for $100M by Randstad.

Launching SucceedSmart, a modern executive recruitment platform with a unique blend of proprietary, patent-pending AI and human expertise, was the culmination of Sathé’s 15 years as an executive search candidate and 15 years as a buyer of executive search. It was clear that the industry was living in the past and ripe for disruption.

While many organizations across the broader HR market were embracing technology, the executive search industry continued to operate almost entirely offline and saw a lack of innovation and technology adoption over the past 50 years.

Sathé invested time in researching both executives and corporate HR leaders to confirm his thinking, and when he received a resounding “yes” to the hypothesis, he dove in to launch SucceedSmart in 2020. SucceedSmart is now on a mission to modernize leadership recruiting for director- to C-level talent and fill complex leadership roles with unmatched agility, accuracy, and affordability, while promoting diversity and transparency.

M.R. Rangaswami: How can artificial intelligence (AI) positively impact HR leaders and teams?

Sanjay Sathé: Businesses across industries have increasingly adopted AI in recent years. It’s no longer a question of whether to embrace AI technology—but when and how.

Contrary to the misconception that AI will eliminate jobs, AI can empower CHROs, talent partners, talent acquisition teams, hiring teams and other employees to work more strategically, and improve diversity and inclusivity. By automating routine tasks, AI also frees up time for HR professionals to focus on the “human” side of human resources and build relationships with candidates and employees.

From an HR perspective, AI automates tasks such as talent sourcing, resume screening, and interview scheduling, and helps centralize all candidate information in a streamlined platform. AI technology also unlocks insights about the hiring process and candidate experience to drive improvements over time. Leveraging AI also minimizes conscious and unconscious biases in the hiring process by matching candidates with jobs that align with their accomplishments, skills, and experience.

M.R.: What are some of the top challenges in executive recruiting today and how can businesses overcome them?

Sanjay: Leadership has an immeasurable impact on business success and executives are among the most critical employees at any organization. Yet, despite increased turnover, business velocity, and competition, executive search has remained devoid of innovation and technological advancements for half a century.

The traditional executive search process can take several months—leading to a poor candidate experience, as well as lost productivity and revenue as roles go unfilled. The approach is transactional, exclusionary, clubby, time-consuming, and expensive. Not only is the pricing exorbitant, but in retained search, corporations may have to pay all those fees and still not get a candidate. And the same executives are often passed around between firms, leading to a limited talent pool.

Embracing modern executive recruitment technology can help address these challenges, decreasing total time to hire and overall hiring costs, and enable organizations to build more diverse leadership teams. It can also support diversity initiatives by focusing specifically on accomplishments and removing demographic and other personally-identifiable information that may lead to unconscious bias during the hiring process.

M.R.: How can businesses effectively build their leadership pipelines given the Silver Tsunami, meaning the wave of Baby Boomer employees retiring in the coming years? 

Sanjay: More than 11,000 Baby Boomers reach retirement age each day and more than 4.1 million Americans are expected to retire each year through 2027.

Traditional executive search primarily focuses on serving organizations—not executives. Firms often wait for executives to reach out to them and the same executives are often passed around between companies, resulting in a limited talent pool. As an increasing number of executives retire as part of the Silver Tsunami, traditional candidate networks are becoming even smaller. 

To improve talent sourcing across all roles amid the Silver Tsunami, organizations can turn to AI-powered candidate recruitment technology—rather than relying on personal connections. This approach enables organizations to be more proactive about succession planning by identifying and nurturing internal talent while simultaneously scouting for external candidates.

A modern executive recruitment platform can support the growing and urgent need to fill executive roles as more workers retire, by enabling corporations to build diverse pipelines of qualified executives and reduce total hiring time to a matter of weeks, compared to four to six months with traditional executive search firms. 

M.R. Rangaswami is the Co-Founder of Sandhill.com

PitchBook’s 2024 Industrial Technology Outlook

What does 2024 hold for industrial tech? PitchBook’s latest Emerging Technology Research looks ahead to what could be in store for verticals like agtech, clean energy, and more.

Here is a summary of Pitchbook’s Outlook on Agtech, the Internet of Things, Supply Chain Tech, Carbon & Emissions Tech, and Clean Energy.

AGTECH: Autonomous farm robots will see a major increase in adoption.

The anticipated surge in adoption of autonomous farm robotics in 2024 is driven by a convergence of compelling factors addressing critical challenges within the agriculture sector.

First, the persistent global labor shortages in agriculture are pushing farmers to seek alternative solutions, with farm automation offering a viable response to mitigate the impact of diminishing workforce availability.

Second, technological advancements, particularly in artificial intelligence, sensors, and automation, have matured to a point where the cost-effectiveness and reliability of robotic systems make them increasingly attractive for widespread adoption.

Third, the imperative to optimize resource use, reduce operational costs, and enhance overall farm efficiency aligns seamlessly with the capabilities of modern farm robotics, positioning them as essential tools for a more sustainable and productive agricultural future.

Fourth, the rise of Robotics-as-a-Service models is proving instrumental in easing upfront costs associated with adopting these technologies.

Fifth, pilot studies have successfully demonstrated the effectiveness of farm robotics, and companies are now transitioning to full-scale commercialization, making 2024 a pivotal year for the integration of these technologies into mainstream agricultural operations.

INTERNET OF THINGS: Outlook: Private 5G startups will produce a unicorn valuation in a late-stage deal or acquisition.

Unicorn valuations have been rare in the Internet of Things (IoT) industry, with only two VC deals (for Dragos and EquipmentShare) valuing companies over $1.0 billion in North America and Europe in 2023. 5G startups have not reached this threshold despite rapid valuation growth among midstage companies and Cradlepoint’s $1 billion exit in the space in 2020. Numerous technical and commercial barriers to entry will ease over the coming year, and revenue growth is on pace to accelerate.

The fundraising timelines of private leaders align with this trend, creating investment opportunities for growth-stage and corporate VC investors, along with telecommunications acquirers.

SUPPLY CHAIN TECH: Drone deliveries will go commercial in the US with more funding and investor interest in the space.

The Federal Aviation Administration (FAA) regulates the drone delivery market with a primary consideration on safety. To date, drones have been barred from flying beyond visual line of sight (BVLOS), meaning an operator must keep the drone within sight at all times while it is flying.

This restriction represents a significant (some might say insurmountable) hurdle for the development of a drone delivery marketplace. The cost of an operator visually tracking and monitoring every delivery via drone is prohibitive.

The FAA has stated that it wants to integrate drones into common airspace, and issued a number of exemptions to the BVLOS rule to startups and larger companies over the course of 2023.

These exemptions open the door for the market to finally develop.

CARBON & EMISSIONS TECH: Demand for carbon credits will recover, following uncertainty in 2022 and 2023.

Voluntary carbon markets (VCMs) have been under significant scrutiny in recent years, particularly carbon credits based on avoidance—rather than removal—of emissions.

Multiple different sets of standards, and the perceived risk associated with low-integrity credits, have been reducing the overall traded volumes of carbon credits and pushing buyers toward removal-based credits, whose integrity is easier to prove.

New independent standards are emerging, and while there are no obligations for credit providers to follow them, they provide the means to show high integrity and reassure buyers.

CLEAN ENERGY: US clean hydrogen technology companies will become acquisition targets.

Low-carbon hydrogen is seen as a key component of global decarbonization efforts, particularly for certain industrial applications and heavy transportation. Earlier this year, the US Department of Energy allocated $7 billion to a program to develop seven hydrogen hubs across the US, to produce, store, and distribute hydrogen.

Companies involved in these hubs are varied, including energy and oil & gas companies that have experience with large-scale energy projects, but will likely look to close technology gaps through acquisitions.

To read PitchBook’s full report, click here.

PitchBook is a Morningstar company providing the most comprehensive, accurate, and hard-to-find data for professionals doing business in the private markets.

Read More

M.R. Asks 3 Questions: Jason Lu, CEO and Founder of CECOCECO

By Article

Jason Lu, the founder of CECOCECO, began his journey in the LED display industry in 2006 by creating ROE Visual. His commitment to perfection and a deep understanding of product quality quickly led to ROE Visual becoming a top brand within the industry.

As an innovator in the field, Jason has consistently been a notable figure in the industry and is never content to rest on past achievements. In 2021 he sought new challenges and founded CECOCECO. With this venture, Jason embraced the idea that LED displays could be more than functional tools; they could integrate technology and aesthetics to create emotionally engaging experiences.

Jason’s reputation for producing high-quality products is built on years of experience and industry knowledge. His dedication to product development was evident in the launch of ArtMorph by CECOCECO. After two years of dedicated work and maintaining high standards, Jason and his team successfully introduced this innovative product to the market.

Under Jason’s leadership, CECOCECO is more than a brand; it’s a testament to ongoing innovation in how the world experiences and interacts with light and display technology.

M.R. Rangaswami: What were the key insights or experiences that led you from ROE Visual to creating CECOCECO, and how do these past experiences shape your current vision?

Jason Lu: I’ve come to recognize that traditional LED displays, while functional, are not universally applicable to every space and often clash with sophisticated designs. My ambition is to develop products that harmoniously blend functionality with aesthetic appeal. I firmly believe that innovation is fueled by pressure. ROE is currently experiencing stable growth, prompting me to initiate transformative changes.

Reflecting on my past experiences, I’ve gained a profound understanding of the path to success and the attitude required for it. I’ve learned that success is not an overnight phenomenon. ROE took 17 years to reach its current stature, reinforcing my belief in the ‘slow and steady wins the race’ philosophy. I don’t equate financial gain with success. While survival is crucial, it’s not the epitome of success. My vision for CECOCECO is to relentlessly pursue excellence in our products, continuously innovate, and be a source of inspiration for the industry and the world at large.

M.R.: How does CECOCECO innovate in the LED lighting and display industry, and what future advancements do you foresee in this space?

Jason: At CECOCECO, our focus is on pioneering solution-based innovation. While similar products and projects exist, we question their viability and sustainability. Our approach involves crafting systematic solutions with an unwavering commitment to quality in every aspect, from the consistent output of our products to the intricacies of our manufacturing process. This is far more than a mere mechanical production; it necessitates a blend of human creativity and precision control. Our development and manufacturing stages demand extensive manpower, embodying a level of craftsmanship of the highest order. CECOCECO’s mission is to transform previously disjointed elements into cohesive, sustainable systems.

Looking ahead, we aim to diversify our product range. This includes offering a wider variety of resolutions and shapes and innovating with flexible screen technologies. Our goal is to provide a more comprehensive and diverse range of solutions to meet the evolving needs of our customers.

M.R.: What emerging trends in LED technology and lighting design do you find most exciting, and how is CECOCECO preparing to integrate these trends into future products?

Jason: The landscape of LED lighting is undergoing two significant transformations. First, there’s a notable shift from point light sources to surface light sources, with Chip-On-Board (COB) technology gaining increasing popularity. This evolution marks a fundamental change in how we perceive and utilize LED lighting. Secondly, the realm of lighting design is witnessing a surge of creativity. It’s transcending beyond mere color shifts and overlays; dynamic, imaginative light effects are becoming the norm, adding a refreshing dimension to lighting.

In response to these trends, CECOCECO is exploring integrating COB technology into our products to harness its unique effects. Lighting design isn’t just an aspect of our product; it’s a cornerstone. We’re committed to experimenting with various surface materials and designs to unlock new potential in creative lighting. Furthermore, we’re enthusiastic about collaborating with leading lighting designers. We aim to conceive and develop even more captivating lighting projects by merging our technological prowess with their creative flair.

M.R. Rangaswami is the Co-Founder of Sandhill.com

Read More

M.R. Asks 3 Questions: Ran Ronen, Co-Founder & CEO of Equally AI

By Article

According to Ran Ronen, 2024 will be the year in which technology leaders innovate by example: by prioritizing digital accessibility, they will create more inclusive experiences and broaden the base of potential users and customers for their technology services and solutions.

Accessible websites and online experiences offer businesses a range of benefits, from compliance with regulatory requirements and industry best practices, to more users and customers accessing the site, to improved SEO, brand trust, and credibility. Before the advancements made possible by AI, making a website truly accessible was a difficult goal for many website owners, owing to the challenge of managing end-to-end accessibility compliance.

Ran is the Co-Founder and CEO of Equally AI, the world’s first no-code web accessibility solution designed to help businesses of all sizes meet regulatory compliance. This conversation was an enlightening one, as he and I spoke about the positive shift he’s seeing in the tech field toward embracing accessibility guidelines as best practices.

M.R. Rangaswami: What is the state of digital accessibility, and why, in today’s tech-driven world, do you think adoption still lags in making accessibility a priority in the user and customer experience?

Ran Ronen: The state of digital accessibility is evolving, yet its integration into mainstream tech remains slower than it should be. Although AI-driven accessibility tools are emerging, many companies still see accessibility as a complex and costly process, often overlooking or delaying it in favor of rapid development. This overlooks the opportunity to appeal to a wider, more diverse customer base and to enhance product usability for everyone from the outset.

Slow adoption also stems from limited awareness of diverse user needs and the wider benefits of accessibility beyond legal compliance. There’s a critical need for tech leaders to see accessibility not just as a necessity for individuals with disabilities, but as a key factor in improving overall user experience and innovation, which in turn boosts brand reputation and customer satisfaction.

M.R.: What are some challenges faced by organizations in managing the technology implementation side of digital accessibility? 

Ran: Organizations implementing digital accessibility often face several challenges, including a lack of in-house expertise on accessibility standards and implementation, which makes integrating these practices into existing tech frameworks difficult. Resource allocation is another challenge, as accessibility often competes with other business priorities and can be seen as an additional cost. Also, ensuring consistent accessibility across a diverse range of products and platforms presents a scalability challenge, requiring a strategic approach to meet various tech and user needs effectively.

M.R.: As an innovator in the space, what is your hope for the impact of AI in making more companies and their offerings more digitally inclusive? 

Ran: As an innovator in the digital accessibility space, my aspiration is that AI will enable a shift in perspective, where digital accessibility becomes not just an aspiration but a practical reality for more companies, especially small and medium-sized businesses. This will help them proactively create accessible products and services, which not only enhances the user experience for all but also opens up new markets and opportunities for innovation. 

M.R. Rangaswami is the Co-Founder of Sandhill.com

Read More

M.R. Asks 3 Questions: Ankit Sobti, Co-Founder and CTO of Postman

By Article

Ankit Sobti is co-founder and CTO for Postman, the world’s leading API Platform. Prior to joining Postman, Ankit worked for Adobe and Yahoo!, where he served as a senior software engineer. In his current role, Ankit focuses on product and development, leading the core technology group at Postman.

A key focus of this Q&A is the findings from a recent global survey Ankit and the Postman team published, tracking the most important trends in API use at large enterprises.

 

M.R. Rangaswami: APIs are critical tools for enterprise success, but should they also be considered products?

Ankit Sobti: Thinking about APIs as products helps to understand and articulate that APIs, like any other item you’d typically call a product – a website, a mobile app, a physical product – are required to be built with a consumer-driven mindset. 

This requires an understanding of who the consumers are, what problems they are trying to solve, why it is a problem in the first place, and what else they are doing to solve it, and then consciously and deliberately designing a solution exposed through the interface of an API.

And like any other product, APIs also need to be packaged, positioned, priced, distributed, and iteratively improved to meet evolving consumer needs.

Postman’s 2023 State of the API Report, which surveyed over 40,000 people, found that 60% of API developers and professionals view their APIs as products, which I think is a good signal that this realization is well underway. And it makes sense that APIs are increasingly seen as products, serving both internal and external customers.

But how does this view vary by industry and company size? And how much revenue can APIs generate? It turns out that the larger the company, the likelier it is to view its APIs as products. At companies with over 5,000 developers, 68% of respondents said they considered their APIs to be products. At the other end of the spectrum were companies with fewer than 10 employees. There, just 49% of respondents viewed their APIs as products. 

M.R.: Are APIs actual revenue generators now for companies?

Ankit: Yes, APIs are increasingly unlocking new streams of revenue and business opportunities for companies. In some of the more traditional industries with lower margins, for example, we are increasingly seeing APIs used as a high-margin revenue stream. And there are numerous examples now of companies where the primary product being sold is the API.

APIs can package insights or key capabilities, drive strategic partnerships, or allow companies to become platforms on top of which others can build. We are seeing examples of this ranging from small development shops all the way to large enterprises.

This is something we also saw in our survey: 65% of respondents affirmed that their APIs generate revenue, and almost 10% of companies with money-making APIs said their APIs generated more than three-fourths of total revenue.

M.R.: Does an API-first approach impact revenue?

Ankit: API-first companies are defined as those that use APIs as the building blocks of their software strategy. APIs bind together not only the internal components of an organization but also pave the way for seamless external collaboration. Thinking in terms of these building blocks, an API-first approach allows for easier externalization of the capabilities that APIs provide and subsequently creates easier paths to revenue.

In addition, we believe that API-first companies have superpowers that foster happier developers and a healthier business ecosystem. In our customer base, we work with companies across a broad range of industries – and APIs generate significant amounts of revenue, unlock new business opportunities, and drive ecosystem expansion through partnerships.

And for companies with APIs, it’s worth weighing how much to invest in them, and adopting an API-first approach. These decisions may have a tangible impact on the bottom line. 

M.R. Rangaswami is the Co-Founder of Sandhill.com

Read More

Innovate, Engage & Succeed: Embracing the PLG Paradigm – 2H 2023

By Article

Allied Advisers has just released its inaugural report on product led growth (PLG).

Product-Led Growth (PLG) is an innovative customer-centric business strategy that employs user-friendly products to acquire, retain, and expand the customer base, reducing the reliance on traditional sales and marketing.

As software users, we have had magical experiences with products that allow us to independently explore, test, purchase and expand usage without intervention from the product vendor’s sales team; these PLG strategies have been utilized successfully by leading SaaS companies such as Dropbox, Zoom, Klaviyo and Slack among others. This contrasts with sales led growth (SLG) that relies on direct sales teams to hunt and harvest product sales opportunities.

This report covers insights on how to develop a PLG strategy from Dharshan Rangegowda, a former Allied Advisers client who grew ScaleGrid via a PLG strategy before raising a growth round with a mid-market PE firm.

Additionally, the report provides details on transactions of PLG companies as well as profiles of certain PLG businesses in different verticals, indicating significant differences in operational efficiencies when adopting a PLG model.

To read the full report, click here.

Read More

M.R. Asks 3 Questions: Godard Abel, CEO of G2

By Article

A 5x SaaS entrepreneur, Godard Abel is CEO of G2, the world’s largest and most trusted software marketplace, which he co-founded in 2012. He is also Executive Chairman of ThreeKit, a leading 3D visualization technology company, and Logik.io, a next-generation configuration technology company.

Previously, Godard served as CEO of SteelBrick which was acquired by Salesforce in 2016. Prior to SteelBrick, Godard co-founded BigMachines, where he served as CEO and built it into a leading SaaS provider which was acquired by Oracle in 2013. He also served as a GM at Niku prior to its IPO in 2000 (and subsequent acquisition by CA).

Before entering the technology industry, Godard consulted for McKinsey & Company and advised leading manufacturers in the U.S. and Germany on strategy development and business process improvement. Godard was a Finalist for EY Entrepreneur of the Year in 2019, named to the Tech 50 list by Crain’s Business Chicago in September 2014, and to the Chicago Entrepreneur Hall of Fame in 2011. He earned an MBA from Stanford University and both a B.S. and M.S. in engineering from the Massachusetts Institute of Technology.

As you can tell by our conversation, Godard is not only an innovator and leader in the tech world, but he is also very skilled at sharing a lot of information in few words.

M.R. Rangaswami: How is software buying changing?

Godard Abel: B2B buyers now expect consumer-like shopping experiences, where they can conduct research and make purchases quickly, conveniently, and on their own terms. This means expensive software solutions can be bought with a credit card, and the buyer conducts research on review sites and other peer communities. In fact, G2 research finds that 67% of global B2B software buyers usually engage a salesperson once they have already made a purchasing decision. 

M.R.: How does AI impact this shift in software buying behavior? 

Godard: AI will only accelerate the ongoing shift to self-serve software research and buying, delivering modern digital buyer experiences. The ability of AI to provide immediate, data-driven insights is a key driver of this shift. With this in mind, software vendors have an opportunity to lean into AI to meet buyers’ preferences for speed, eliminating friction in the software-buying journey.

M.R.: What role does G2 play in this evolving software landscape? 

Godard: G2 has over 2.4 million verified reviews on 150,000+ products and services. All 1 billion knowledge workers around the world need software and they’re coming to G2 to research it. With our massive dataset on B2B software and the most traffic from software buyers, G2 is uniquely positioned to power software buying and selling in the age of AI. 

Earlier this year, we introduced Monty, the first-ever AI-powered software business assistant built on OpenAI’s ChatGPT. Previously, a buyer would visit G2.com and search for the type of software they were looking for – CRM, for example. However, not every buyer knows exactly what they need.

With Monty, you can now describe the business challenge you’re looking to solve and have a conversation. Powered by G2’s extensive dataset, Monty can recommend the best software solutions for your particular need, making the process of researching software faster, easier, and more effective.

M.R. Rangaswami is the Co-Founder of Sandhill.com

Read More

M.R. Asks 3 Questions: Jay Wolcott, Co-Founder & CEO, Knowbl

By Article

What does the future of customer experience look like with generative AI?

According to Knowbl’s CEO and Co-Founder, Jay Wolcott, it’s going to be critical to understand the risks of implementing AI solutions and the requirements for what “enterprise-ready conversational AI” means.

In this conversation, Jay sheds light on how this innovative technology redefines customer experience, making interactions more seamless, convenient, and efficient.

M.R. Rangaswami: What exactly is “BrandGPT,” and how does it differ from traditional conversational AI technologies? 

Jay Wolcott: BrandGPT is a revolutionary enterprise platform for Conversational AI (CAI), built from the ground up on large language models (LLMs). Legacy virtual assistant platforms built on BiLSTM and RNN frameworks lack the speed, ease, and scalability that LLMs can offer through few-shot learning.

Through the release of this all-new approach, CAI can finally meet its potential of creating an effortless self-service experience for consumers with brands. The proprietary AI approach Knowbl has designed within BrandGPT offers truly conversational and contextual interactions while guarding against the uncontrollable risks of Generative AI.

This new approach is driving tons of enterprise excitement for new levels of containment, deflection, and satisfaction across digital and telephony deployments. Beyond the improved recognition and conversational approach, Knowbl’s platform allows brands to launch quickly, leverage existing content, and improve the scalability of capabilities while reducing the technical effort to manage. 

M.R.: What emerging trends do you foresee shaping the future of conversational AI and customer experience, and how can businesses prepare for these developments?

Jay: In 2024 we plan to overcome customer frustration with brand bots and virtual assistants, ushering in a new era of effortless and conversational experiences powered by advanced language models.

Brands that embrace LLMs for customer automation early on will establish a competitive advantage, while those who lag will struggle to keep up. Although many organizations are still in the experimental phase of using GenAI for internal purposes due to perceived risks, leading brands are boldly venturing into direct customer automation, reimagining digital interfaces with an “always-on” brand assistant.

We also predict 2024 to be the year that bad bots die. New expectations of AI will lead to frustrated consumers when dealing with legacy bots, and a trend in attrition versus retention will appear.

M.R.: What complexities do multinational companies face when implementing AI-driven solutions, and how can they navigate the challenges to ensure successful adoption across diverse markets?

Jay: Multinational companies encounter a myriad of complexities when implementing AI-driven solutions, stemming from the diversity of the markets in which they operate. One significant challenge lies in reconciling varied regulatory landscapes and compliance requirements across different countries, necessitating a nuanced approach to AI implementation that adheres to local regulations.

Additionally, cultural and linguistic diversity poses a hurdle, as AI solutions must be tailored to resonate with the unique preferences and expectations of diverse consumer bases. To successfully navigate these challenges, companies must prioritize a robust localization strategy, customizing AI solutions to align with each market’s specific needs and cultural nuances. 

Collaborating with local experts, remaining vigilant about regulatory changes, and fostering open communication with stakeholders are essential for multinational companies to achieve successful AI adoption across diverse markets.

M.R. Rangaswami is the Co-Founder of Sandhill.com

Read More

M.R. Asks 3 Questions: John Hayes, Founder & CEO, Ghost Autonomy

By Article

John Hayes is CEO and founder of autonomous vehicle software innovator Ghost Autonomy.

Prior to Ghost, John founded Pure Storage, taking the company public (PSTG, $11 billion market cap) in 2015. As Pure’s chief architect, he harnessed the consumer industry’s transition to flash storage (including the iPhone and MacBook Air) to reimagine the enterprise data center, inventing blazing-fast flash storage solutions now run by the world’s largest cloud and ecommerce providers, financial and healthcare institutions, science and research organizations, and governments.

Like Pure, Ghost uses software to achieve near-perfect reliability and re-defines simplicity and efficiency with commodity consumer hardware. Ghost is headquartered in Mountain View with additional offices in Detroit, Dallas and Sydney. Investors including Mike Speiser at Sutter Hill Ventures, Keith Rabois at Founders Fund and Vinod Khosla at Khosla Ventures have invested $200 million in the company.

Now, let’s get into it, shall we?

M.R. Rangaswami: How does the expansion of LLMs to new multi-modal capabilities extend their application to new use cases?

John Hayes: Multi-modal large language models (MLLMs) can process, understand, and draw conclusions from diverse inputs like video, images, and sounds, expanding beyond simple text inputs and opening up an entirely new set of use cases in everything from medicine to legal to retail applications. Training GPT models on more and more application-specific data will help improve them for their specific tasks. Fine-tuning will increase the quality of results, reduce the chances of hallucinations, and provide usable, well-structured outputs.

Specifically in the autonomous vehicle space, MLLMs have the potential to reason about driving scenes holistically, combining perception and planning to generate deeper scene understanding and turn it into safe maneuver suggestions. The models offer a new way to add reasoning for navigating complex scenes, or scenes never seen before.

For example, construction zones have unusual components that can be difficult for simpler AI models to navigate: temporary lanes, people holding signs that change, and complex negotiation with other road users. LLMs have been shown to be able to process all of these variables in concert, with human-like levels of reasoning.

M.R.: How is this new expansion impacting autonomous driving, and what does it mean for the “autonomy stack” developed over the past 20 years?

John: I believe MLLMs present the opportunity to rethink the autonomy stack holistically. Today’s self-driving technologies have a fragility problem, struggling with the long tail of rare and unusual events. These systems are built “bottom-up,” composed of a combination of point AI networks and hand-written driving software logic to perform the various tasks of perception, sensor fusion, drive planning, and drive execution, all atop a complicated stack of sensors, maps, and compute.

This approach has led to an intractable “long tail” problem, where every unique situation discovered on the road requires a new special-purpose model and software integration, which only makes the total system more complex and fragile. With current autonomous systems, when a scene becomes so complex that the in-car AI can no longer safely drive, the car must “fall back,” either to remote drivers in a call center or by alerting the in-car driver.

MLLMs present the opportunity to solve these issues with a “top-down” approach by using a model that is broadly trained on the world’s knowledge and then optimized to execute the driving task. This adds complex reasoning without adding software complexity – one large model simply adds the right driving logic to the existing system for thousands (or millions) of edge cases.

There are challenges implementing this type of system today, as the current MLLMs are too large to run on embedded in-car processors. One solution is a hybrid architecture, where the large-scale MLLMs running in the cloud collaborate with specially trained models running in-car, splitting the autonomy task and the long-term versus short-term planning between car and cloud.

M.R.: What’s the biggest hurdle to overcome in bringing these new, powerful forms of AI into our everyday lives?

John: For many use cases, the current performance of these models is already sufficient for broad commercialization. However, some of the most important use cases for AI, from medicine to legal work to autonomous driving, have an extremely high bar for commercial acceptance. In short, your calendar can be wrong, but your driver or doctor cannot.

We need significant improvements in reliability and performance (especially speed) to realize the full potential of this technology. This is exactly why there is a market for application-specific companies doing research and development on these general models. Making them work quickly and reliably for specific applications takes a lot of domain-specific training data and expertise.

Fine-tuning models for specific applications has already proven to work well for text-based LLMs, and I expect the same will happen with MLLMs. I think companies like Ghost, which have lots of training data and a deep understanding of the application, will dramatically improve upon the existing general models. The general models themselves will also improve over time.

What is most exciting about this field is the trajectory — the amount of investment and rate of improvement is astonishing — we are going to see some incredible advances in the coming months.

M.R. Rangaswami is the Co-Founder of Sandhill.com

 

Read More

M.R. Asks 3 Questions: Gerry Fan, CEO of XConn Technologies

By Article

Gerry Fan serves as the Chief Executive Officer at XConn Technologies, a company at the forefront of innovation in next-generation interconnect technology tailored for high-performance computing and AI applications.

Established in 2020 by a team of seasoned experts in memory and processing, XConn is dedicated to making Compute Express Link™ (CXL™), an industry-endorsed Cache-Coherent Interconnect for Processors, Memory Expansion, and Accelerators, accessible to a broader market.

In pursuit of expediting the adoption of CXL, Gerry and his teams have successfully introduced the world’s inaugural hybrid CXL and PCIe switch – with a strategic approach that will make computers faster, smarter, and better for the environment.

M.R. Rangaswami: What barriers faced by AI and HPC applications are you looking to solve?

Gerry Fan: Next-generation applications for artificial intelligence (AI) and high-performance computing (HPC) continue to face memory limitations. The exponential demand these applications place on memory bandwidth has become a barrier to their further innovation and widespread adoption.

The CXL specification has been developed to alleviate this challenge by offering unprecedented memory capacity and bandwidth so that critical applications, such as research for drug discovery, climate modeling or natural language processing, can be delivered without memory constraints. By applying CXL technology to break through the memory bottleneck, XConn is helping to advance next-generation applications where a universal interface can allow CPUs, GPUs, DPUs, FPGAs and other accelerators to share memory seamlessly.


M.R.: How are you looking to solve the challenge with the industry’s first and only hybrid CXL and PCIe switch?

Gerry: While CXL technology is poised to alleviate memory barriers in AI and HPC, a hybrid approach that combines CXL and PCIe on a single switch provides a more seamless pathway to CXL adoption. PCIe (Peripheral Component Interconnect Express) is a widely used interface for connecting hardware components, including GPUs and storage devices. Many traditional applications only need the interconnect capability offered by PCIe, yet, increasingly, next-generation applications need the higher bandwidth enabled by CXL. System designers can be stuck deciding which approach will best meet their needs.


XConn is meeting this challenge by offering the industry’s first and only hybrid CXL 2.0 and PCIe Gen 5 switch. Combining both interconnect technologies on a single 256-lane SoC, the XConn switch is able to offer the industry’s lowest port-to-port latency and lowest power consumption per port in a single chip – all at a low total cost of ownership. What’s more, system designers only have to design once to achieve versatile expansion, heterogeneous integration for a mix of accelerators, and fault tolerance with the redundancy mission critical applications require for true processing availability.

M.R.: In your view, how will XConn revolutionize the future of high-performance computing and AI applications?

Gerry: Together with other leading CXL ecosystem players, XConn is delivering on CXL’s promise to support faster, more agile AI processing. This will deliver the performance gains AI and HPC applications need to accelerate research and innovation breakthroughs. It will also support greater energy efficiency and sustainability while helping to proliferate the “AI Everywhere” paradigm for smarter and more autonomous systems.

By helping to foster innovation and accelerate application use cases, XConn is delivering the missing link that will pave the way for unprecedented computing performance needed for tomorrow’s breakthroughs and technology advancements.

M.R. Rangaswami is the Co-Founder of Sandhill.com

Read More

M.R. Asks 3 Questions: Razat Gaurav, CEO of Planview

By Article

When I sat with Razat, he was clear about the imperative of digitalisation in almost every organisation and every industry today, which is what is driving more than $3trn of annual spending on it.

His rationale behind digitalisation is sound, but as he shared, studies show that much of that work is wasted: more than 40%, in some cases. This is largely due to the disconnect between strategy and what’s being executed by teams across the business.

As CEO of Planview, the leader in portfolio management and value stream management, Razat Gaurav shares in this conversation why bridging the strategy-execution gap is essential for organizational and leadership transformation.

Would you believe that 40% of strategy work gets wasted in execution?

M.R. Rangaswami: What is the biggest challenge organisations face when connecting strategy to execution?

Razat Gaurav: The biggest challenge between strategy and execution is change: change from technology shifts, demographic shifts, and even generational shifts. It’s not a new phenomenon. But what has changed is that the pace of change is exponentially faster. Companies must be able to quickly analyse and adapt or evolve their strategy, and how those changes are executed, while still driving important business outcomes.

M.R.: The research arm of The Economist found that 86% of executives think their organizations need to improve accountability for strategy implementation. What challenges do orgs face around measurement?

Razat: The key thing that gets in the way is data silos. Most organisations are swimming in data, yet most of that data is not usable to make decisions. Curating the relevant data to align with your priorities and objectives is critical to achieving accountability for strategy implementation.

What we find is that many organisations have three major gaps when they look at how they measure understanding of strategic goals.

First, organisations are measuring inputs or outputs, but they’re not measuring outcomes. Particularly when dealing with digital transformations, the business and technology teams must work together to focus on the outcome.

The second gap is around creating a synchronised, connected approach to objectives and key results, what some organisations call OKRs. Is leadership in alignment with the way an individual contributor gets measured? And does the individual contributor understand how they impact their leadership’s OKRs? That bidirectional synchronisation is key.

And then the last piece is how the different functions in the organisation (finance, manufacturing, sales, and so on) align their OKRs to help achieve the company’s objectives and key results.

M.R.: What should leaders do first to narrow the strategy-execution gap?

Razat: My first piece of advice would be, take a deep breath because change is constant.

As organisations, as leaders, as individuals, we all have to be ready to adapt and change. But beyond taking that deep breath, there are three things I’d advise organisations to do.

First, figure out the three initiatives that will actually move the needle. Second, define OKRs and an incentive structure for the outcomes you’re trying to achieve. Third, invest in systems that allow you to break out of those data silos to execute as one organisation, as one team.

M.R. Rangaswami is the Co-Founder of Sandhill.com

Read More

SMB SaaS: The Younger and Sometimes Overlooked Sibling of Enterprise SaaS

By Reports

According to the recent update from Allied Advisers, SMB is the backbone of the US economy; 99.9% of all US businesses are in this segment. With rising SaaS adoption by small businesses for enhancing productivity, we remain optimistic on the long-term view of this sector.

While SMB SaaS unsurprisingly has higher churn than Enterprise SaaS, it has significantly better operational metrics when it comes to sales and marketing expense, R&D expense, and EBITDA margins, and it faces less sector competition. Our report covers the nuances of SMB SaaS, and we believe SMB SaaS businesses continue to offer compelling opportunities for investors and buyers.

This particular Allied Advisers report updates their SMB SaaS coverage, highlighting a sector that has been growing with notable outcomes.

The report pulls from IPOs of Freshworks ($1.03B), DigitalOcean ($780M), and Klaviyo ($576M), notable exits such as Mailchimp’s acquisition ($12B+, one of the largest bootstrapped exits), and the growth of private SMB SaaS companies like Calendly (last valued at $3B) and Notion (last valued at $10B).

To see the full summary of Allied Advisers’ update, click here:

Gaurav Bhasin is the Managing Director of Allied Advisers.

Read More

State of SaaS M&A: 4 Buyers’ Perspectives

By Article

One year ago, Software Equity Group started their 2022 report on M&A trends with a simple observation: the stock market activity was not for the faint of heart. That view led to a much broader inquiry throughout the report into the myriad of dynamics at play and the associated impact on the software M&A market.

So how are Founders and CEOs exercising caution when considering M&A and liquidity events in the face of ongoing economic uncertainty, and is their restraint warranted?

To cut to the chase: it depends. For software businesses with the right profile (more on that later), there is tremendous opportunity in the current M&A landscape.

To better assess the state of the market, SEG analyzed data from our annual survey of CEOs, private equity investors, and strategic buyers, in addition to our quarterly report and our transactions.

HERE ARE SEG’S 4 TAKEAWAYS FROM THE RESEARCH:

1. Cautious CEOs Are Holding Off On Going To Market

    Not surprisingly, the macroeconomic environment has colored CEOs’ perceptions of the SaaS M&A market. Seventy-eight percent believe valuations are the same as or lower than last year, and over two-thirds believe the market will improve in the coming years.

    As a result, many are waiting to explore and see what the future holds before going to market.

    2. Buyers And Investors Face Shortage Of Opportunities

    In contrast to the CEOs’ viewpoint, buyers and investors are finding that the competition is holding steady or getting stronger. They are eager to do deals with high-quality businesses, but there are not as many opportunities available as in 2022.

    Meanwhile, 66.7% of strategics say they have seen no change or a decrease in the volume of high-quality SaaS companies in the market over the past year. This supports the idea that high-quality M&A opportunities are scarce in 2023 and high-quality businesses that pursue a liquidity event receive outsized interest from buyers and investors.

    3. Growth, Retention & Profitability Are Key

    Given the uncertainty in the macro markets over the last 18 months, it is not surprising that buyers have become more risk-averse, and the profile of a highly desirable asset has shifted.

    Nevertheless, while revenue growth and retention are weighted strongly, there is little interest in businesses burning significant cash. In 2020 and 2021, the high-burn, growth-at-all-cost model was considered an attractive asset. In 2023, the story has now changed.

    4. High-Quality Assets Are Demanding Premium Valuations

    The current market represents a classic supply and demand dynamic. When the supply of a good decreases, and the demand for said good stays the same or increases, its price is expected to increase.

    Where is the data that supports it?

    The answer is hard to find in the public markets. The share prices of public SaaS companies in the SEG SaaS Index have rebounded this year but are still down roughly 36% from COVID-level peaks.

    The Nasdaq has sharply rebounded from 2022 lows, due to the “Magnificent 7” companies and excitement over artificial intelligence. Most notably, valuations in M&A deals have decreased by 36% since 2021.

    There Is Good News For SaaS Companies.

    It is easy to understand why CEOs are cautious right now, and many are right to be. The landscape has shifted from where it was a few years ago, with buyer and investor priorities shifting as well. It is clear, however, that the deficit of profitably growing assets on the market is working in favor of sellers.

    This is due to increasing competition for highly sought-after software companies that display strong revenue growth and retention. One thing everyone agrees on: higher valuations lie ahead.

    To read the full SEG review on SaaS M&A: 4 Buyers’ Perspectives, click here.

    Read More

    Status Check: 5 Early Predictions for 2023

    By Article

    In January 2023, Leigh Segall, Chief Strategy Officer at Smart Communications, a leading technology company focused on helping businesses engage in more meaningful customer conversations, shared her predictions on what businesses would focus on in customer experience in 2023.

    We’ve kept these in our back pocket, knowing that as we round out Q4 it would be useful to reflect on where customer experience strategies currently stand in this climate.

    1. Ever-changing customer behaviors will require enterprises to reimagine existing business models

    The accelerated shift to digital that was originally driven by the global pandemic has consumers expecting total digital freedom, with the ability to choose when, where and how they interact with brands across many industries.

    Even those who were slow to adopt digital are now on board — which means businesses must adapt, not just to meet today’s expectations but also to prepare for the changes tomorrow may bring. Analysts and experts agree that businesses must focus on customer-centricity — particularly industries that have lagged in moving to digital. And they can show that they care by focusing less on one-way transactions and more on two-way customer conversations that drive trust and loyalty, and provide value. 

    2. Conversational experiences will make or break brand loyalty and customer trust

    Consumers and businesses alike are overwhelmed with choice, making competition for attention and loyalty fiercer than ever. Add ongoing instability to the equation, and cultivating trust becomes the key to fostering lasting customer relationships.

    Earning customer trust is especially challenging for industries that deal with emotionally-charged matters — such as money, health, and property loss or damage. Businesses addressing these needs should cultivate a tech ecosystem that’s interconnected and interoperable, pulling together data and processes from multiple systems of record to create easy, efficient conversations that are both sophisticated and seamless. 

    3. Enterprises will automate and digitize key business processes to increase operational efficiency

    The pandemic-accelerated pace of digital transformation has led to an IT skills shortage that’s being felt globally. And many businesses are looking to low-code solutions to reduce the burden on IT and increase operational efficiency by empowering non-technical business users.

    Shifting the mindset away from maintenance paves the path for future success by freeing IT teams from routine and repetitive tasks, allowing them to focus on more strategic initiatives. Cloud-based solutions also reduce total cost of ownership (TCO) and technical debt while bringing much needed resilience. Cultivating a tech ecosystem that brings agility and flexibility at scale will be critical to increasing operational efficiency without impacting customer experience. 

    4. Enterprises will mitigate risks and protect brand reputation by increasing the focus on compliance and regulatory requirements

    Continuing cyberthreats are creating an increased need for business leaders to focus on compliance and regulatory requirements, which are constantly evolving — particularly for highly-regulated industries such as financial services, healthcare and insurance.

    Adopting a cloud-first approach will enable highly-regulated organizations to greatly reduce risks and keep up with ever-changing regulatory requirements — which will continue to evolve in 2023 and beyond. Investing in the right tech partners enables deep visibility into the nuanced requirements of each industry, with the ability to easily make sweeping updates as the rules of engagement change. Layering on automated, digitized solutions helps to ensure communications are compliant across all customer touchpoints; legacy systems simply aren’t up to the task. 

    5. Technological innovation will remain a top priority as enterprises recognize the increased need for agility and scalability

    Business leaders know that speed and scale are mission critical. As global markets become more interconnected and waves of change continue to rise, enterprises must be able to adapt on the fly — and at massive scale. This calls for replacing legacy systems and processes with sophisticated, cloud-first solutions that enable data interconnectivity, operational efficiency and enterprise-wide flexibility.

    As customer expectations continue to evolve, businesses need to be able to access and act on customer data and deliver personalized, unique customer interactions at every touchpoint. 

    We’d love to hear your thoughts — so please send us an email!

    Read More

    Ashu Garg: 3 Takeaways from the Generative AI “Unconference”

    By Article

    As General Partner at Foundation Capital, Ashu Garg collaborates with startups throughout the enterprise stack. His career reflects his enthusiasm for machine learning and for revolutionizing established software domains to create fresh consumer interactions.

    While FC’s inaugural Generative AI “Unconference” was held back in June, we still find ourselves referencing Ashu’s observations from the conference. We hope you take away as much from his highlights as we have.

    1. AI natives have key advantages over AI incumbents

    In AI, as in other technology waves, every aspiring founder (and investor!) wants to know: Will incumbents acquire innovation before startups can acquire distribution? Incumbents benefit from scale, distribution, and data; startups can counter with business model innovation, agility, and speed—which, with today’s supersonic pace of product evolution, may prove more strategic than ever.

    To win, startups will have to lean into their strength of quickly experimenting and shipping. Other strategies for startups include focusing on a specific vertical, building network effects, and bootstrapping data moats, which can deepen over time through product usage.

    2. In AI, the old rules of building software applications still apply

    How can builders add value around foundation models? Does the value lie in domain-specific data and customizations? Does it accrue through the product experience and serving logic built around the model? Are there other insertion points that founders should consider?

    While foundation models will likely commoditize in the future, for now, model choice matters. From there, an AI product’s value depends on the architecture that developers build around that model. This includes technical decisions like prompts (including how their outputs are chained to both each other and external systems and tools), embeddings and their storage and retrieval mechanisms, context window management, and intuitive UX design that guides users in their product journeys.

    3. Small is the new big

    Bigger models and more data have long been the go-to ingredients for advancements in AI. Yet, as our second keynote speaker, Sean Lie, Founder and Chief Hardware Architect at Cerebras, relayed, we’re nearing a point of diminishing returns for simply supersizing models. Beyond a certain threshold, more parameters do not necessarily equate to better performance. Giant models waste valuable computational resources, causing costs for training and use to skyrocket.

    To read Ashu’s full report, and his Top 5 Takeaways, click here.

    Read More

    M.R. Asks 3 Questions: Colin Campbell, Author

    By Article

    Roughly 20% of new businesses fail within the first year, and 50% are gone within five years.

    So what makes a startup successful? Is it mainly a combination of hard work and luck, or is there a winning formula?

    Colin C. Campbell has been a serial entrepreneur for over 30 years. He has founded and scaled various internet companies that collectively have reached a valuation of almost $1 billion. In his new book, Start. Scale. Exit. Repeat.: Serial Entrepreneurs’ Secrets Revealed! Colin shares a wealth of experience, with an in-depth guide featuring interviews with industry experts and points readers in the right direction on their entrepreneurial journey to help answer the questions they’ll encounter.

    M.R. Rangaswami: What is it about what you share in Start. Scale. Exit. Repeat.: Serial Entrepreneurs’ Secrets Revealed! that you feel hasn’t been shared before?

    Colin Campbell: Start. Scale. Exit. Repeat. represents 30 years of my experience as a serial entrepreneur, a decade of research and writing, and over 200 interviews with experts, authors, and fellow serial entrepreneurs. The book deconstructs the stages of building a company from inception to exit, and lays out strategies to replicate this success repeatedly.

    At each stage of a company’s life cycle, it’s crucial to fine-tune your narrative, assemble the right team, secure adequate funding, and put in place effective systems. The strategies for achieving these vary dramatically, from the chaotic, founder-centric startup phase to the more structured approach needed to scale. As you near the finish line, your strategy will have to pivot once again.

    The core message of Start. Scale. Exit. Repeat. is that entrepreneurship isn’t a “one and done” affair. It’s a skill—akin to any other trade—that you can master and continually refine. There’s a recipe for launching a successful startup, and this book simplifies it into actionable steps to be taken one at a time.

    Furthermore, the book challenges the prevailing obsession with unicorns. We exist in a “unicorn culture,” where a valuation under a billion dollars is often frowned upon. But this mindset is perilous. The high-velocity chase for unicorn status has led to a wreckage of dreams and fortunes along the Silicon Valley highway. I’ve witnessed countless founders succumb to this “Silicon Valley disease,” sacrificing years of labor and significant capital.

    There’s a more pragmatic approach to building wealth, and it’s far simpler: start, scale, exit, take some money off the table, and repeat.

    M.R.: What was your biggest lesson from one of your biggest setbacks?

    Colin: Let’s take a trip down memory lane to the early ’90s. My brother and I launched an Internet Service Provider (ISP) in Canada. We were pioneers on the “Information Superhighway,” connecting hundreds of thousands of Canadians to the internet. We found ourselves in the whirlwind Geoffrey Moore famously described as the “Tornado.” It was an exhilarating ride, especially for a couple of 20-somethings who had grown up on a farm.

    We took the company public later in the ’90s and merged it with a wireless cable company, closing at a valuation of approximately $180 million. After receiving 50% of a wireless spectrum for fixed wireless internet from the Canadian government—yes, they handed out spectrum back then to encourage competition—our company’s valuation skyrocketed to over $1 billion. Technically, it was a stock-for-stock swap, with our shares being locked up for 18 months. At 28 years old in 1998, I owned almost 14% of the company.

    We thought we were invincible. The internet was poised to change everything, and we were on the forefront. 

    Then, out of nowhere, the .COM crash hit. 

    Our company pulled its secondary offering to raise $50 million because the Nasdaq had tanked to 4,000. And it kept falling, plummeting to 1,300 and not recovering for over a decade. It was indeed the .COM crash, and the music had stopped—without enough chairs to go around.

    Did we make mistakes? Absolutely. We shouldn’t have relinquished control without securing liquidity. “Liquidity or control” has since become our mantra for all future ventures. And let’s face it—stuff happens. Technologies evolve, regulations change, and market climates shift. That’s why it’s crucial to exit when times are good. When the party’s in full swing, make a discreet exit, take some money off the table, and focus on your next venture.

    As for that unicorn of ours? It filed for bankruptcy protection, and our stock plummeted from a high of $19 a share to the paltry sum I sold it for: 6 cents a share.

    Thankfully, we regrouped and stuck to our strengths. We launched Hostopia, a global leader in hosting and email solutions for telecoms. We took it public and eventually sold it to a Fortune 500 company—this time for an all-cash deal—just a month before the Lehman crisis in 2008.

    M.R.: In your experience, once a business survives the risky first five years, what’s the next riskiest precipice it encounters?

    Colin: The vast majority of companies in America are small businesses, and most struggle to scale. But make no mistake—there’s a formula for scaling your enterprise. Some companies might find it more challenging than others, and some may opt out due to the stress and transformative changes that come with scaling.

    In the SaaS (Software as a Service) industry: if you’re not growing, you’re dying. After the .COM crash, we found ourselves running low on funds while operating our hosting and email platform. Still, we remained optimistic. Why? Because even though we were bleeding $500,000 per month, our customer base was growing. Growth is the lifeline in SaaS; losing money is acceptable as long as you’re expanding.

    Hostopia, for example, adhered to the Rule of 40, maintaining a growth rate plus profit margin that exceeded 40%. We achieved 32 consecutive quarters of growth, leading to an IPO and ultimately a successful sale at a 60% premium over our trading price to a Fortune 500 company. Another venture, .CLUB Domains, also operated in the red for several years. Nevertheless, we managed to cut losses by about half a million dollars annually until we started adding the same amount to our bottom line, culminating in an exit to GoDaddy Registry.
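    The Rule of 40 Colin describes is simple arithmetic: revenue growth rate plus profit margin should total at least 40 percentage points. A minimal sketch of that check (function names are illustrative, not from any particular library):

```python
# Rule of 40: revenue growth rate + profit margin should be >= 40.
# Both inputs are in percentage points; a loss is a negative margin.

def rule_of_40_score(revenue_growth_pct: float, profit_margin_pct: float) -> float:
    """Return the Rule of 40 score: growth plus margin, in percentage points."""
    return revenue_growth_pct + profit_margin_pct

def passes_rule_of_40(revenue_growth_pct: float, profit_margin_pct: float) -> bool:
    return rule_of_40_score(revenue_growth_pct, profit_margin_pct) >= 40.0

# A company growing 50% while losing 5% still clears the bar (score = 45),
# which is why "losing money is acceptable as long as you're expanding."
print(passes_rule_of_40(50, -5))   # True
# A company growing 25% at a 10% margin does not (score = 35).
print(passes_rule_of_40(25, 10))   # False
```

    This is why the rule rewards either lever: a fast-growing, cash-burning business and a slow-growing, highly profitable one can both clear 40.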

    Am I a genius entrepreneur? As much as I’d like to think so, that’s far from the truth. In 2005, our company was facing internal strife, stalled sales, and a board questioning my role as CEO. One board member even remarked, “He’s too young and way in over his head.” That’s when a friend introduced me to Patrick Thean, a coach at Rhythm Systems. Patrick taught us invaluable systems like goal setting, strategic planning, daily huddles, and KPI tracking. In addition, we partnered with other coaches to transform the organization from a tech-centric company to a sales-driven organization. The ultimate effect of all of these changes: we tripled our size within a few years.

    Since then, we incorporated these systems along with countless other insights I’ve gathered from serial entrepreneurs, experts, and authors. We’ve encapsulated these stories and lessons in the book, laying out a clear roadmap for SaaS companies aiming to scale.

    M.R. Rangaswami is the Co-Founder of Sandhill.com

    Read More

    A Quick Q&A with David Luke, Global Practice Leader at Consulting Solutions

    By Article

    Organizational Optimization is what gives David Luke’s career credibility.

    In this quick Q&A, David shares his insights on the major staffing and retention challenges tech leaders are facing and how IT teams can accelerate their approaches to innovation to stay competitive.

    M.R. Rangaswami: What kind of staffing and retention challenges are IT leaders facing right now?

    David Luke: IT leaders are experiencing a new phenomenon in today’s professionals: an influx of talent that is demanding to work in non-traditional ways. HR departments are finding it difficult to create a standard job class or role category. Executives and line managers alike are turning to firms like Consulting Solutions for a la carte solutions to address anti-patterns that are impeding their business.

    Here are what I believe to be the top five challenges in our current labor market:

    1. Creating a safe space for employees where they can land, grow, and learn while delivering both innovative and traditional pieces of work. By partnering with HR and recruiting firms, leaders can develop a place where folks want to work, are able to grow their career to the level that they desire, and develop their knowledge / skills with a defined path forward.
    2. Attracting people who are late career that bring knowledge and maturity to an organization. These are the gems in our workforce that can not only deliver with speed but also mentor new professionals in the workforce.
    3. The ability to balance a lower-cost delivery with a world-class product and retaining those people that deliver that product.
    4. The decision between remote and on-site, which means ensuring that you are getting the talent that will accelerate your business by offering options for your people. There is some exceptional talent out there who would love to work remotely, and then there are also folks who thrive in an in person collaborative environment. Leaders need to weigh how they want their workforce to be shaped and potentially develop a blend.
    5. Although it’s an attractive practice, leaders need to understand some of the limitations of nearshoring/offshoring their workforce: fewer overlapping hours, decreased team retention due to offshore labor practices, and limited collaboration. Ensure that you weigh the cost savings against delivering an exceptional product.

    M.R.: Why is the “product owner role” so critical to delivery team success? 
     
    David Luke: Exceptional product owners use their superpowers to bring the product vision down to the team level. They focus relentlessly on prioritizing what is needed and what is wanted for their business, their stakeholders, and their customers. The best product owners can strike the right balance between being specific enough to provide clear direction to the team while still being flexible enough to accommodate changes and shifts in priorities that come from a deep and dynamic partnership with product managers.
     
    These proverbial unicorns also have a deep knowledge of user needs and the experience that the business wants the customer to receive. They easily see the bigger picture and engage often with product managers, customer experience, and user-experience experts to define and drive the delivery of great products. 
     
    Elite product owners have an abundance of empathy in their toolkits. They’re able to read the pulse of the team, the customer, and stakeholders while balancing the push and pull to deliver great products.
     
    What sets apart the truly outstanding product owners is the ability to effectively listen. Not just to the words but to the underlying messages and sentiments of everyone who they actively seek to communicate with as part of their rituals, ceremonies, and workdays. 
     
    Great product owners don’t just look inward; they excel at looking outward to the market, the competition, and the changing technologies that they work with every day. They know the goals and challenges and can articulate the path forward to lead their teams and their products to successful outcomes. They are storytellers, evangelists, and cheerleaders for their teams and their products. The word on the chest of their superhero suit is often “TEAM”. 
     


    M.R.: If technology is evolving faster than workplace structures can keep up, what must IT teams do to accelerate their approach to stay competitive and deliver results? 
     
    David: At the heart of any change to approach, regardless of its scope, lies the critical support of leadership. While grassroots efforts can certainly achieve success, a unified message and commitment from the top sets the tone for the entire organization.

    To ensure an accelerated approach, it is also essential to establish governance and a defined way of working, while remaining open to adjusting these as you gain a deeper understanding of your company’s culture. With these foundational elements in place, you can then develop charters and set clear, measurable objectives and key results (OKRs) to guide your progress toward success. And most importantly, START THE WORK! Don’t get bogged down in planning—act and stay focused on delivering results. 
     

    Once you have established a new, accelerated way of working, you must set about to streamline your efforts and prioritize the things that are most important to your customers. Use your product owners, UX experts, and CX experts to gain the trust and pulse of your customers, as they are who you are building for, and they will tell you if you are getting it right. Leverage new practices such as design thinking to understand who you are building for, what their pains are, and how you can deliver products to eliminate or alleviate those pains.

    M.R. Rangaswami is the Co-Founder of Sandhill.com

    Read More

    Navigating M&A in Uncertain Markets: 2H 2023 Update

    By Reports

    The question of “are the market conditions right” remains in the minds of investors and executives interested in exploring M&A. We address this question by sharing our perspectives on how to achieve a successful M&A outcome.

    Our recommendations are based on Allied Advisers’ deep experience in advising clients on their exit to both Fortune-500 and mid-market strategic buyers, as well as a diversity of PE funds. In the last 12 months, we advised clients on their exit to: Activision Blizzard King, Walmart, Dura Software, PSG Equity and Virtana among others.

    While 2010-2021 were robust years for M&A and capital raises for technology companies, today’s markets have changed significantly in terms of deal volume and valuation, though we are seeing improvement toward a more rational and sustainable market.

    With the major indices rebounding this year from the lows of 2022, the question of “are the market conditions right” still remains in the minds of investors and executives interested in exploring M&A.

    This article covers some of the M&A trends, including that private equity (PE) continues to be a major driver of deal volume, that there have been new technology M&A buyers among larger private companies, and that we are seeing stabilization of deal volume and value.

    Also, the impending IPOs of Arm (semiconductors), Klaviyo (software) and Instacart (internet) not only provide a litmus test of what private companies are worth in public markets but also create currency, potentially opening the door for them and a slew of other companies to future IPOs and M&A.

    We at Allied Advisers are also sharing our own observations and our perspectives on how to achieve a successful M&A outcome in the current environment. In the last 12 months, we advised clients on their exit to: Activision Blizzard King, the world’s largest game network and a Fortune 500 company; Walmart, a Fortune One company; Dura Software, a software consolidator; PSG Equity, a top tier PE fund ($22.1B AUM); and Virtana, a growing PE backed company among others.

    Below is the full report from Allied Advisers:

    Gaurav Bhasin is the Managing Director at Allied Advisers.

    Read More

    A Quick Q&A with Jonathan Tomek, Vice President of R&D at Digital Element

    By Article

    This conversation comes ahead of Cyber Security Month, sharing what information is available to our network of tech leaders and the cyber security solutions available to them.

    Jonathan Tomek is a VP at Digital Element, a global IP geolocation and intelligence leader for over 20 years. He is a seasoned threat intelligence researcher with a background in network forensics, incident handling, malware analysis, and many other technology skills. Previously, Jonathan served as CEO of MadX LLC, Head of Threat Intelligence at White Ops, and Director of Threat Research at LookingGlass Cyber Solutions, Inc.

    In this Q&A Jonathan shares the challenges that many of the world’s largest websites, brands, security companies, ad networks, social media platforms and mobile publishers face–and the best practices his team takes to combat online fraud.

    M.R. Rangaswami: With the rise of VPNs and residential proxy IP networks, many corporate security teams seem to struggle to see who is accessing their networks and data. How should they approach security as these trends accelerate?


    Jonathan Tomek: IP address intelligence data can help security teams hone their best practices for establishing rules for who can access their network. For instance, IP address data reveals a great deal about masked traffic, such as whether it is coming from a VPN, darknet or residential IP proxy. With this knowledge, security teams can opt to block all darknet traffic automatically.

    Likewise, knowing that many people use residential IP proxies to scrape websites for competitive research, security professionals can opt to block all residential IP proxies.

    The important factor here is context. A company may not be concerned about VPN traffic in general, but if thousands of failed login attempts from a specific VPN over a short time period are observed, this would be indicative of an individual threat versus many unknown attacks.
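    The contextual rule Tomek describes, flagging a burst of failed logins from one source while leaving ordinary traffic alone, can be sketched roughly as follows. The event shape, source labels and threshold here are illustrative assumptions, not Digital Element's API:

    ```python
    from collections import defaultdict

    # Each event: (timestamp_seconds, source_label, success_flag). The source
    # label would come from IP intelligence, e.g. "vpn:provider-x".
    def flag_suspicious_sources(events, window=300, threshold=1000):
        """Flag sources with too many failed logins inside a sliding time window."""
        failures = defaultdict(list)
        for ts, source, ok in events:
            if not ok:
                failures[source].append(ts)

        flagged = set()
        for source, times in failures.items():
            times.sort()
            start = 0
            for end in range(len(times)):
                # Shrink the window until it spans at most `window` seconds.
                while times[end] - times[start] > window:
                    start += 1
                if end - start + 1 >= threshold:
                    flagged.add(source)
                    break
        return flagged

    # 1,200 failed attempts from one VPN inside four minutes trips the rule;
    # a handful of scattered failures elsewhere does not.
    events = [(i * 0.2, "vpn:provider-x", False) for i in range(1200)]
    events += [(i * 60, "residential", False) for i in range(10)]
    print(flag_suspicious_sources(events))  # {'vpn:provider-x'}
    ```

    In practice the source label is what IP intelligence contributes: without it, the same failures would look like unrelated addresses rather than one masked actor.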

    Digital Element also knows a great deal about the VPN market, including which providers offer features that enable nefarious players to hide their activities.

    That insight can be used to set access policies based on the VPN provider. For instance, you may want, as a matter of policy, to block all traffic that stems from VPNs that are free, accept crypto payments or offer no-logging as an option, as these are features that allow bad actors to cover their tracks.

    Though blocking is a common theme, the context provided can be more important at times, especially after an incident, by helping teams understand the characteristics of the threat and narrow down the area of focus.


    M.R.: Requesting additional authentication is a safe, but costly, practice. How can IP address intelligence data help security teams drive efficiency in their access policies?

    Jonathan Tomek: Asking for additional authentication is a good security measure, but it does require additional computing power, which isn’t free. It also affects the user experience, especially when a loyal customer signs into a system frequently.

    IP address intelligence data is useful here, both in helping networks save resources, and ensuring a more seamless user experience. Such insights include IP stability, which tells us how long a specific IP address has been observed at a specific location.

    If a customer signs into your network every day via the same IP address observed at the same geolocation, there may be no need to request a second authentication. But if one day that user attempts to sign in from an IP address geolocated on the other side of the country, or from a nearby region but via a VPN, it would be a good idea to validate them.

    IP address intelligence data can provide context to help security teams set policies that prioritize when to request additional authentication.
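    As a rough sketch of that step-up policy, the decision reduces to: skip the second factor for a stable, familiar IP and location, and require it when the connection is masked or the geolocation is new. The field names below are illustrative assumptions, not a vendor schema:

    ```python
    # Minimal step-up authentication decision, assuming IP intelligence has
    # already annotated the login with geolocation and masking flags.
    def needs_second_factor(login, history):
        """history maps user -> set of (ip, region) pairs seen before."""
        known = history.get(login["user"], set())
        familiar = (login["ip"], login["region"]) in known
        masked = login.get("is_vpn", False) or login.get("is_proxy", False)
        return masked or not familiar

    history = {"alice": {("203.0.113.7", "us-west")}}

    # Same IP and region as always: no extra prompt.
    print(needs_second_factor(
        {"user": "alice", "ip": "203.0.113.7", "region": "us-west"}, history))  # False

    # New region, and via a VPN: ask for validation.
    print(needs_second_factor(
        {"user": "alice", "ip": "198.51.100.9", "region": "us-east",
         "is_vpn": True}, history))  # True
    ```

    A production policy would also weigh IP stability scores and risk signals, but the shape is the same: context decides when the costly second prompt is worth it.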

    M.R.: How can IP intelligence data help security teams understand how a breach occurred, and minimize any damage done?


    Jonathan Tomek: That’s a great question. Every security professional understands that, try as you might, it is simply impossible to prevent a breach.

    The best approach is to be able to respond quickly and minimize the impact in the event of a breach. IP address intelligence is a critical addition to a security information and event management (SIEM) solution.

    By leveraging IP intelligence, you have additional data points which can help reduce false positive alerts, while also refining other alerts for investigators.

    The ability to cluster events is a huge timesaver. If a specific VPN was used during a breach, you could find related IP addresses and see how the attacker was attempting to gain entry to your infrastructure, helping you with the timeline.
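    The clustering step Tomek mentions amounts to pulling every event that shares the breach IP's tag (here, a hypothetical VPN provider label) and ordering it in time. The event shape and provider names below are illustrative assumptions:

    ```python
    # Sketch of the timeline-building step: once the breach IP is attributed
    # to a VPN provider, gather all events sharing that tag, oldest first.
    def cluster_by_provider(events, provider):
        related = [e for e in events if e["provider"] == provider]
        return sorted(related, key=lambda e: e["ts"])

    events = [
        {"ts": 300, "ip": "198.51.100.4", "provider": "vpn-x", "action": "port scan"},
        {"ts": 100, "ip": "198.51.100.9", "provider": "vpn-x", "action": "login attempt"},
        {"ts": 200, "ip": "203.0.113.7", "provider": "residential", "action": "page view"},
    ]

    for e in cluster_by_provider(events, "vpn-x"):
        print(e["ts"], e["ip"], e["action"])
    # 100 198.51.100.9 login attempt
    # 300 198.51.100.4 port scan
    ```

    Two superficially unrelated IP addresses collapse into one attacker timeline once the shared provider attribute is visible.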

    M.R. Rangaswami is the Co-Founder of Sandhill.com

    Read More

    SEG Snapshot: 2Q23’s SaaS M&A and Public Market Report

    By Reports

    Software Equity Group’s quarterly report is in, revealing that an improved outlook across the broader macroeconomy, industry excitement around AI, and overall investor optimism for growth businesses contributed to a solid first half for publicly traded B2B SaaS companies.

    Meanwhile, continued strategic buyer and private equity interest has resulted in strong M&A outcomes for high-quality SaaS businesses exhibiting capital efficient growth, strong retention, and product differentiation. 

    Here are five highlights from the report:

    1. Aggregate software industry M&A deal volume has seen strong momentum in recent quarters, reaching 897 total deals in 2Q23, up 5% from 855 deals in 1Q23

    2. Deal activity for SaaS M&A remains high relative to historical periods (538 in 2Q23). Although deal volume in 2Q23 experienced a 5% decrease over the prior quarter, SaaS M&A is on pace for the second-highest annual total in the last ten years (only eclipsed by the bubble year of 2022). The month of May saw 192 M&A deals, the second-highest monthly deal volume for SaaS in ten months.

    3. The average EV/TTM revenue multiple for 2Q23 was 5.6x. However, specific cohorts within SaaS are continuing to sell for premium multiples. Companies fitting the profile from a SaaS KPI standpoint (capital-efficient growth, strong retention, etc.) and offering product differentiation are achieving strong outcomes.

    4. Vertical SaaS comprised 46% of all M&A deals in 2Q23. Financial Services jumped up to the pole position of the verticals, representing 18.9% of all SaaS deals.

    5. Private equity appetite for SaaS M&A remains high as it represented the majority (61.3%) of deals in 2Q23. PE-backed strategics represented 52.4% of deals, and PE platform investments were 8.9%.
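    As a purely hypothetical illustration of how the headline multiple in point 3 translates into valuation, an EV/TTM revenue multiple simply prices a company as a fixed multiple of its trailing-twelve-month revenue:

    ```python
    # Hypothetical SaaS company with $20M trailing-twelve-month revenue,
    # priced at the report's 5.6x average EV/TTM revenue multiple.
    ttm_revenue = 20_000_000
    multiple = 5.6
    enterprise_value = ttm_revenue * multiple
    print(f"${enterprise_value:,.0f}")  # $112,000,000
    ```

    The premium cohorts the report describes would command a higher multiple on the same revenue base.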

    Download the full report from Software Equity Group, here:

    Read More

    M.R. Asks 3 Questions: Evan Huck, Co-Founder & CEO of UserEvidence

    By Article

    With the decline of trust among B2B buyers because of vendor over-promising, economic pressures, and shifting expectations, CEO Evan Huck (and his co-founder Ray Rhoades) have been evaluating the evolution of social proof in the buying journey.

    Pulling from his experiences working at TechValidate and SurveyMonkey, Evan was inspired to create a company that could help businesses quickly and efficiently capture customer feedback and — leveraging the power of AI — automatically create on-brand content at scale, removing a significant source of friction from modern go-to-market teams’ sales motions.

    M.R. Rangaswami: Trust is at an all-time low for B2B buyers. What’s causing this and why does it matter?

    Evan Huck: B2B buyers are becoming increasingly skeptical of vendor marketing hype after repeatedly being burned by sales teams overpromising and underdelivering. Economic pressures have placed increased scrutiny on every tech purchase, upping the ante on the importance of making the right purchase the first time. Additionally, a recent Gallup poll found that greater access to information, lack of company focus on the customer lifecycle, and shifting expectations from a younger generation of buyers are all contributing factors to the breakdown of trust between vendors and buyers. As a result, peer recommendations and social proof are emerging as critical factors in the B2B buying journey.

    Why does this matter? Vendors are no longer in control of the buyer journey, and they get less direct interaction with the prospect. Buyers expect to see relevant customer examples validated by real-world data before making large technology purchases. To rebuild trust with buyers, vendors need more than a handful of curated customer success stories – they need a library of authentic and relevant customer proof points that prove the product’s value across different use cases, company sizes, and industries.

    M.R.: More than ever before, B2B buyers now look to their peers, not vendors, when making buying decisions. How is UserEvidence helping B2B software companies use customer feedback to address this new reality? 

    Evan: Historically it has been very difficult to gather enough reliable customer stories – seeking out these proof points is often labor intensive, laden with approvals, and costly. In the past, companies typically have created their own content in-house or leaned on an outside agency for support in collecting and creating these assets. These solutions have left companies scrambling to fill in the gaps as buyers demand more real-world examples they can connect to.

    UserEvidence resolves these issues by providing one platform that all go-to-market functions can use to capture customer feedback and — through advanced generative AI capabilities — deliver unbiased customer stories and beautifully designed assets for companies to use in their sales initiatives. Long gone are the days of analyzing customer data manually; UserEvidence processes these datasets quickly so that go-to-market teams can start creating content that attracts buyers. Companies can now easily collect and create these customer stories at scale, taking control of their most valuable asset: real-world social proof.

    Another benefit of the UserEvidence platform is the ability to continuously capture feedback and sentiment from users and customers, at important junctures in the customer journey. Surveys are delivered at key moments throughout the customer lifecycle, creating a continuous stream of learnings and insights that drives good decision making.

    M.R.: Getting feedback from actual customers helps not only B2B buyers, but every internal function across GTM teams. How does UserEvidence plan to bridge this gap?

    Evan: Every function in a B2B company — from the functions that sell a product (product marketing, sales enablement, customer marketing, and customer success), to the functions that build the product (product, product management, strategy) — should be guided by the voice of the customer and customer feedback.

    The problem is each function’s efforts to capture feedback are siloed, and the learnings from each effort aren’t shared between functions. Positive stories from a product management survey never make it into the hands of a sales team. Negative feedback from a marketing team’s efforts to find users willing to do case studies never makes it to product management or customer success.

    UserEvidence helps unify feedback collection efforts across functions, and helps each function take action on that feedback. Marketing can create on-brand sales and marketing assets, while product management can get insights on how to make the product experience better. Several goals are accomplished with one touch to the customer making for a more elegant customer experience.

    M.R. Rangaswami is the Co-Founder of Sandhill.com

    Read More

    Quick Answers to Quick Questions: Ivan Houlihan, SVP & Head of West Coast U.S. for IDA Ireland

    By Article

    A slightly different conversation this week as we speak to Ivan Houlihan, Senior Vice President and Head of the West Coast of the United States for IDA Ireland–the Investment and Development Agency of the Irish Government, which promotes foreign direct investment into Ireland. 

    Based in California, Ivan leads the team that works closely with existing and potential clients in technology, financial services, life sciences and engineering throughout the Western US and Mexico. 

    We hope you enjoy this week’s angle on cybersecurity, cyber skills and microcredentials.

    M.R. Rangaswami: How Do Microcredentials Address the Cybersecurity Talent Scarcity Problem?

    Ivan Houlihan: While nations pass resolutions and laws that try to prevent cybercrime, the most widespread answer is increasing the supply of expert security talent to stay ahead of the criminals. 

    Ivan Houlihan suggests an innovative approach, which involves the concept of microcredentials. These are small, accredited courses that allow candidates to pursue highly focused upskilling and reskilling that responds to specific market needs. Besides creating qualified new candidates to quickly come on board, this solution opens the door to workers that might otherwise not have pursued careers in cybersecurity.

    As the head of the West Coast U.S. for IDA Ireland, Houlihan has seen an increasing number of American technology firms with operations in Ireland employ this strategy to address their cybersecurity talent crunch. 

    When it comes to microcredentials in cybersecurity, Houlihan believes that Ireland’s innovative training programs can become a model for other nations seeking to address the serious issue of cybercrime, which is predicted to cost the world $10.5 trillion by 2025. In this quick Q&A, he explains the basics of setting up a microcredentials program in the cybersecurity space – although microcredentials can be earned in other technical areas, too.

    M.R. Rangaswami: What are some of the current issues impacting cybersecurity staffing and why are microcredential programs a reasonable solution? 

    Ivan: Technology workers in general are often in short supply, but when it comes to qualified cybersecurity personnel, the problem is compounded by educational requirements, along with specific skills that take time and money to acquire for those seeking to enter this field. Technical degrees, specialized training and, often, some graduate work have discouraged many would-be candidates, particularly those put off by the prospect of student loans and related barriers. One of the biggest myths in the cybersecurity field is that it’s just for people with high proficiency in math, men only or those with certain graduate degrees. People also assume they must go to renowned universities to study for the field in order to pursue such careers. All these factors have conspired to decrease the pool of qualified candidates.

    Microcredential programs short-circuit the time and cost of pursuing a lucrative cybersecurity career, although the field does require some technical training as a starting point. Fortunately, assumptions about being male, holding graduate degrees and the like don’t apply. Microcredentials bring down the cost and time commitments while increasing cybersecurity job opportunities for women, military veterans, minority groups, people from financially disadvantaged backgrounds, workers from other departments and others previously not often found in the profession. And since microcredential programs are typically online and of short duration, they can be “stacked,” or combined to form bigger accreditations – this makes it easier to get the right kind of training for a promising new career. The most successful microcredential programs demonstrate a collaborative effort between universities, governments, research institutions and industry, with the latter providing curriculum input based on what candidates need to know to hit the ground running.

    M.R.: Describe the cybersecurity microcredential programs you’re aware of, how they operate and the results so far.

    Ivan: It’s encouraging to say Ireland has been ahead of other nations in its efforts to increase the supply of cybersecurity talent. Last year, the International Information System Security Certification Consortium, or (ISC)², the world’s largest IT security organization, released a report that found Ireland closed its cybersecurity skills gap by 19.5% while the global gap grew by 26.2%. Through a government grant in 2020, Ireland created Europe’s first microcredential program, called CyberSkills, a collaboration between national agencies, industry and three leading Irish universities, led by Donna O’Shea, Chair of Cybersecurity at MTU.

    Sign-up and instruction are online. In addition to 30 carefully designed microcredentials that learners can take as standalone pieces of learning or integrate into predesigned academic pathways, the program utilizes what’s called the “cyber range”: a unique, cloud-based, securely sandboxed area that simulates real-world scenarios and environments where students can test their new skills.

    In talking to O’Shea, she told us that CyberSkills has already trained hundreds of people – and the program is expanding. She believes that the simple but effective collaboration concept of this program could be duplicated by other nations wishing to accelerate and expand their supply of cyber talent. The key, underlying concept of CyberSkills is that the training is totally focused on graduates being able to walk into jobs immediately and have the knowledge they need to be effective. 

    At a higher level, everyone should look at these microcredential programs as a major innovation in workforce development and lifelong learning. Being largely co-designed by industry makes them relevant and effective while their ease of use and low cost create new avenues for skills development long into the future. 

    M.R. Rangaswami is the Co-Founder of Sandhill.com

    Read More

    M.R. Asks 3 Questions: Sahir Ali, Founder of Modi Ventures

    By Article

    Dr. Sahir Ali is a technology and healthcare leader, investor and board advisor with extensive experience in artificial intelligence, medical imaging, cancer research, enterprise technology and cloud computing. He has advised and led Fortune 500 companies, hedge funds and other organizations in implementing and integrating cloud technologies and artificial intelligence/data science.

    As founder of Modi Ventures, a private investment firm focused on investing in venture capital funds and early-stage startups in disruptive and emerging AI and medical technology applications, we thought Sahir’s perspective on the current investing trends in healthcare AI and TechBio would be valuable.

    M.R. Rangaswami: What types of tech bio and healthcare AI investments are gaining funding in the current economic climate?

    Sahir Ali: Some of the most exciting breakthroughs in medicine today are happening at the nexus of biology and computer science, using tools such as artificial intelligence (AI).  There are two major tech-enabled bio investments themes: therapeutic platforms (drug discovery companies based on novel platform technology) and transformative technologies (companies developing applications of breakthrough technological advances such as genomics and digital health). 

    M.R.: What advice do you have for emerging startups to succeed in the crowded healthcare technology market? 

    Sahir: Startups that focus on platform technologies that can yield multiple programs and shots on goal, instead of individual assets with binary outcomes, tend to be very attractive from an investment perspective, as well as from a time-to-market and valuation standpoint. We also encourage our founders to establish high-quality partnerships across the ecosystem — true platforms produce many more assets than any individual company can develop.

    The healthcare industry is slow to adopt new technology, so startups need to market their product effectively to reach the target audience, especially for digital health and consumer products.

    M.R.: What areas of investing in healthcare AI are gaining the most traction in this economy? 

    Sahir: There is a great deal of traction (funding and support) for companies that combine AI technology to generate novel candidates and strong drug development expertise to validate and find the best potential drugs. Another key area is gene therapy, which offers the potential to cure—not just treat the symptoms of—many major diseases. Some of the most transformative technologies are major new applications of genomics. Next-generation sequencing has outpaced even the fabled Moore’s Law, as the cost and information content of sequencing has improved even faster than the cost and information content of computer chips.

    Companies that incorporate next-gen sequencing into diagnostic applications can enable better clinical outcomes at radically reduced costs. When cancer is detected late, only 20% of patients survive for five years, but when detected early, 80% survive. Early detection saves lives and billions of dollars per year in medical costs.

    M.R. Rangaswami is the Co-Founder of Sandhill.com

    Read More

    Act Small: The Key to Growing Durable Companies & Communities with M.R. Rangaswami

    By Uncategorized

    For those of you who have followed M.R. and his illustrious career, you may know a little about his resume from four decades in Silicon Valley.

    However, in M.R.’s interview with DataStax Chairman and CEO Chet Kapoor, they both offer stories, humor, reflections and lessons that take us beyond their LinkedIn profiles and into the minds of some of our industry’s great builders.

    We hope you enjoy this light-hearted conversation on your next commute.

    M.R. Rangaswami is the Co-Founder of Sandhill.com (the domain he bought for $20 in 1997)

    Read More

    M.R. Asks 3 Questions: Rahul Ponnala, Co-Founder & CEO of Granica

    By Article

    Rahul Ponnala is the co-founder and CEO of Granica — the world’s first AI efficiency platform — which is on a mission to make AI affordable, accessible and safe to use.

    He previously served as Director of Storage and Integrations at Pure Storage, where he engineered and integrated large-scale databases and file storage systems powered by all-flash technology. As a governing board member of The FinOps Foundation under The Linux Foundation, he helps shape the future of cloud financial management. A multidisciplinary academic, Rahul’s research spans mathematics, information theory, machine learning and distributed systems. He holds a portfolio of patents in computational statistics and data compression.


    M.R. Rangaswami: What are the hard business and/or technology problems that inspired you to found Granica?

    Rahul Ponnala: Advancements in deep learning have been powered by ever-larger models processing ever-growing amounts of data. The performance output of an AI algorithm is primarily determined by the diversity and volume of data it can access. So, as AI becomes integral to products and services in nearly every domain, access to “high quality” data will become both a critical necessity and a fundamental constraint, ultimately dictating the pace and effectiveness of AI investments at enterprises.

    To derive “high quality” data, enterprises must extract the maximum amount of information from their data stores and thereby maximize the value of their data – but the challenge here is two-fold. First, as data volume grows, so do the costs of managing, processing and storing it in the cloud.

    Second, as the potential for insight from new data sources increases, the risk of misuse and mishandling increases. Enterprises who can successfully contain rising cloud costs associated with growing data stores, while ensuring the safe use of data in AI to preserve its analytical value, will develop formidable, competitive moats.

    Since its inception, Granica has been developing cutting-edge and efficient solutions to allow enterprises to maximize the value of their data – our AI efficiency services are no exception. We are witnessing a Cambrian-like explosion in the pace of deployment of AI into various apps, products and services, marking a major technological shift in the future of computing. And while there has been meaningful progress on the computing infrastructure and algorithmic layers of AI and ML, there has been little progress in increasing the signal-to-noise ratio of the data fueling these algorithms.

    This is a very difficult problem, involving deep information and computer science developments, combined with large-scale systems engineering – and this is precisely the problem Granica is focused on solving.

    M.R.: How will your AI efficiency platform impact the future of enterprise AI/ML adoption? What is your advice to organizations that want to adopt a more efficient and productive cloud data architecture for their AI initiatives?

    Rahul: Extracting the maximum amount of information from data stores is perhaps the most critical element in the long-term success (or lack thereof) of an organization’s AI investments and strategy. So by delivering a platform capable of helping organizations do just that, Granica is democratizing access to AI by directly making AI more affordable, more accessible and safe to use.

    By now, most organizations have grasped the importance and criticality of integrating an AI strategy into their corporate planning; in fact, this was the most popular question Wall Street analysts asked the management teams of big tech companies this past earnings cycle.

    Yet, most organizations – large and small – are left hamstrung in determining where to start and how to do so in an efficient manner, while operating under a set of both economic and time constraints imposed by the market.

    When speaking with customers about AI, the number one question that comes up is: “How can I get started and where should I get started?” And our answer, unsurprisingly, is: “Let’s first evaluate the effectiveness and efficiency of your organization’s data strategy.”

    By getting plugged into a customer’s environment and providing deep, informative analytics with respect to their cloud data stores and how their data is being used, we are able to provide direct visibility and insight into the inefficiencies present in that customer’s data architecture and gain a deep understanding of that customer’s data and workload characteristics.

    This then allows Granica to quickly configure and tailor our platform to their environment and thus accelerate the time to value for the customer. By providing customers with efficient building blocks and tools for their data architecture and AI-powered applications, we can help them optimize their data access, storage and compute resources and thus maximize the value of their data.

    M.R.: You’ve expressed that people are integral to your company. What are your values/philosophies as a leader with respect to growing successful teams?

    Rahul: At Granica, our employees, or “ninja warriors” as we like to call them, are the backbone of our organization. We share successes as a team, we make mistakes as a team and we challenge each other.

    This not only allows us to bring our best professional selves to the office but also to build long-term friendships and trust with one another. We want each of our employees to feel comfortable turning to one another for guidance, help and coaching – not just about “work”, but also about personal circumstances.

    By doing so, we leverage the collective intelligence of the whole to put everything we can into delivering exceptional experiences for our customers and inspiring one another along the way.

    Everyone at Granica lives by the motto of “Whatever it Takes” and we actually have this signage up on our wall in the lobby of our headquarters. It doesn’t matter whether you’re an individual contributor or manager at Granica – we want everyone to be leaders and we want to provide the resources, mentorship and growth opportunities to allow each ninja to grow their careers to new heights.

    M.R. Rangaswami is the Co-Founder of Sandhill.com

    Read More

    Allied Advisers Sector Update on Automation Software

    By Article

    Allied Advisers has published their sector update on Automation Software which provides an overview of this important segment, recent exciting trends, the transactional market and active acquirers and investors in the ecosystem.

    Automation technologies are becoming increasingly pervasive across industries, driven by the clear opportunity to achieve great improvements in productivity and process efficiency and to reduce human error. Tailwinds to this sector have been strengthened by rising costs of labor and operations in an inflationary environment, and by innovations in enabling technologies such as AI/machine learning, IIoT and cloud.

    The increasing adoption of automation will necessitate further investment into developing technology skills. It is expected that many manually repetitive and low-skill jobs will be replaced by automation technologies, leading to higher unemployment in the economy. On the positive side, automation also opens up the opportunity for workers to be freed up from mundane tasks. Workers who retool and elevate their skill sets for the new world will be able to use their time more effectively and work well with machines to their benefit.

    Download their full report here:

    Gaurav Bhasin is the Managing Director of Allied Advisers

    Read More

    Quick Answers to Quick Questions: Aidan McArdle, VP of Technology, Cirrus Data

    By Article

    Aidan McArdle serves as the VP of Technology for Cirrus Data, a leader in block data mobility technology and services. Prior to joining Cirrus Data, Aidan worked at Hewlett Packard Enterprise (HPE) for 17 years, focusing on enterprise storage, servers and operating systems.

    In his role at Cirrus Data, Aidan leads a global team to solve complex problems with great technology, develops global services programs, and leads all aspects of pre-sales, product development, and partner management for major initiatives. Aidan also serves as EMEA Partner Enablement Director, helping partners and customers deliver success with their software.

    M.R.: What is the most important cloud trend today and what makes it so important?

    Aidan McArdle: Top of mind for organizations continues to be cloud adoption, but there is also a strong focus on FinOps or, to put it simply, cost optimization, governance, and control. The IT landscape has been awash with layoffs for more than a year now, and every enterprise is tightening purse strings as operating expenses (OPEX) come under increased scrutiny from those paying the public cloud bills.

    When storage was largely on-premises, production environments were almost always overprovisioned. It was all capital expenditures that were planned well in advance, and it wasn’t uncommon to have 30-40% utilization. In the cloud, the costs are monthly, and any capacity wasted is hitting their OPEX budgets. Cost control and optimization have become the norm for enterprises, which are striving to find more cost-effective ways to deliver their desired level of performance, reliability, and security.
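    The utilization point can be made concrete with some back-of-the-envelope arithmetic. The figures below are illustrative assumptions, not Cirrus Data numbers, showing what carrying on-premises-style overprovisioning into a monthly cloud bill looks like:

    ```python
    # Hypothetical: 100 TB provisioned in the cloud at $0.02 per GB-month,
    # but only 35% actually utilized (the midpoint of the 30-40% range above).
    provisioned_gb = 100 * 1024
    price_per_gb_month = 0.02
    utilization = 0.35

    monthly_bill = provisioned_gb * price_per_gb_month
    wasted = monthly_bill * (1 - utilization)
    print(f"monthly bill ${monthly_bill:,.2f}, of which ${wasted:,.2f} is idle capacity")
    ```

    On-premises, that idle capacity was a one-time capital cost; in the cloud, it recurs every month, which is what pushes cost optimization up the agenda.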

    M.R.: How is cloud computing today impacting CIOs and their enterprises?

    Aidan: How best to benefit from the cloud will be (or at least should be) at the top of each CIO’s goals for 2023. It’s very hard to find an enterprise that has not seen fallout from the post-COVID slowdown.

    The race to the cloud and the need to accelerate digital transformation have delivered many lessons in the last three years. In the rush to scale flexibly and deliver agile applications, many created straightforward ‘lift and shift’ plans, the idea being that the organization could take a database or application running on-premises and move it to the cloud themselves with little effort. What we’ve seen is that organizations that managed to move pieces of their workloads to the cloud themselves are struggling with huge cost overruns. Other organizations are stuck in delays trying to determine the best path forward.

    With a renewed focus on optimization, control, and governance, we will see a positive impact. Costs should be controlled and likely reduced while teams gain a focus on the value of FinOps. 

    I’ve had a number of really interesting conversations with businesses about the cost of cloud, repatriation and the shift back to on-premises. We have helped some organizations repatriate their workloads as they realize that, for their environment, an on-premises or hybrid cloud strategy is ideal. And for others, we have found they can meet their goals without a lot of post-migration pain by analyzing their workloads and optimizing ahead of moving them to the cloud.

    This focus has sparked several interesting debates at management meetings this year and has hopefully resulted in plans to gain control over cloud spend at many enterprises.

    M.R.: What else should organizations be thinking about when considering cloud best practices?

    Aidan: I don’t believe any organization is too small to look at FinOps and cost optimization. The fundamentals can help set down best practices for organizations of all sizes. For companies that are evaluating a cloud strategy in 2023 or 2024, I always recommend including the migration as part of the strategic planning. Migration is often an afterthought, and this leads to challenges. When accurate planning is not in place to connect people, process, time, and budget to deliver on the intended outcomes, you will always find problems. On the contrary, when the migration is planned properly, it is generally executed faster and with minimal impact to the business.

    M.R. Rangaswami is the Co-Founder of Sandhill.com


    M.R. Asks 3 Questions: Dror Weiss, Founder & CEO of Tabnine


    Dror Weiss, Founder and CEO of Tabnine, and his team are the creators of the industry’s first AI-powered assistant for developers. As a generative AI technology veteran, he is on a mission to help developers and teams create better software faster.

    In this quick conversation, Dror discusses how developers can leverage generative AI today, covers how open source is advancing the generative AI movement and shares his thoughts on what’s to come.

    M.R. Rangaswami: How can developers take advantage of generative AI technology today? What can they expect in terms of benefit?

    Dror Weiss: Software developers can leverage generative AI for code today – in fact, Tabnine has around 8M installs across the VS Code and JetBrains marketplaces. Developers will see the most immediate benefit if they are working in languages that have a large open source example set (JavaScript, Python, etc.). However, the value of generative AI for code is likely even higher with esoteric languages and unique code that are currently in the domain of enterprises.

    Code completion numbers vary significantly (25-45%), but with detailed ROI studies our customers are seeing mid-teens to low twenties in actual productivity uplift.

    M.R.: How is open source helping to advance the generative AI movement? 

    Dror: At the moment, open source cannot compete on spending to build the largest models (e.g. GPT-4), because these currently cost hundreds of millions of dollars and pull in as much data as possible.

    However, we are already seeing strong evolution in open source toward smaller models built specially for use cases such as code. We believe these specialized models are the way forward, and they have already significantly closed the gap with the largest models.

    Much like Linux became the default for operating systems, we expect that open source will do the same for AI.

    M.R.: What’s next for generative AI – for developers, the enterprise?

    Dror: For developers, we believe generative AI for code will continue to expand into areas such as testing, chat and custom models. As for the enterprise, they are pushing for secure and controlled solutions, indicating they are all in on generative AI. 

    M.R. Rangaswami is the Co-Founder of Sandhill.com


    M.R. Asks 3 Questions: Pranay Ahlawat, Partner & Associate, Boston Consulting Group


    We’re long past being able to escape Generative AI as a weekly conversation topic – from keynotes at software company conferences to investment themes for VC/PE investors, it is everywhere.

    We reached out to Pranay Ahlawat, Partner and Associate Director at Boston Consulting Group, after reading his article on the Generative AI trends that really matter. We were impressed and intrigued by how Pranay sees this topic from multiple angles – advising clients, advising investors, and working as a practitioner – and wanted to share his insights with our Sandhill executive network.

    Pranay’s focus on enterprise software and AI at BCG helps him separate hype from reality, understand the trends that really matter, and see what software companies, enterprises and investors must know about Generative AI.

    M.R. Rangaswami: We have certainly been in hype cycles in the past, what is different about Generative AI and why does it matter?  

    Pranay Ahlawat: Foundational models and the problem of natural language conversation aren’t new. Natural Language Processing, chatbot platforms and out-of-box text APIs from cloud vendors have been around for a decade now. Foundational models like ResNet-50 have been around since 2015. There are two things that are different about modern-day Generative AI.

    First, modern language models or Large Language Models (LLMs) are architecturally different and have a significant performance advantage over traditional approaches like Recurrent Neural Networks and LSTMs (Long Short-Term Memory networks). You will often hear the words “transformers” and “attention”, which, simply put, refer to the model’s ability to remember the context of the conversation more effectively. The quality of comprehension and the ability to generate longer free-form text are unlike what we have seen in the past.

    Second, these models have a killer app unlike any other, one that is immediately consumable by non-technical users. We have had transformative technology breakthroughs in the past – internet, mobile, virtualization and cloud – but nothing has come close to the astonishing rise of ChatGPT, which reached a hundred million users in about two months. This tangibility has added to the hype, and despite the huge potential, a lot of the claims about Generative AI are unrealistic.

    It matters because of the potential impact it has on society. We are a small step closer to general intelligence, and we can potentially solve problems we weren’t able to solve before. It’s disruptive for many industries like media, education and personalization. Time will tell how quickly this will happen.

    M.R.: What are the three things people must know about Generative AI today?

    Pranay: For me, there are three underlying principles you must know: (1) Generative AI is getting democratized, (2) the economics of Generative AI are a crucial vector of innovation, and (3) the technology itself has limitations and risks.

    First, the technology at the platform level is already democratized and the barriers to entry continue to go down. If you look at the commercial players – model vendors like Cohere and Anthropic, platform vendors like Google and AWS, and multiple other tooling and platform vendors (e.g. IBM watsonx and Nvidia NeMo) – all are making it easier to build, test and deploy generative AI applications. There is real excitement in open source and community-driven innovation at all layers, e.g. frameworks like PyTorch, foundation models like Stable Diffusion and LLaMA, model aggregators like Hugging Face, and libraries like LangChain. Today, a developer can create a generative AI application in a matter of hours, and a lot of complexity is abstracted away because of modern tooling. We have more than five hundred generative AI startups already, and the barriers to entry are continuing to come down.

    Second, winners will know how to get the economics right. These models are incredibly expensive to train, tune and run inference on. A 300B-parameter model costs anywhere from $2-5M in compute to train, and models like GPT-3 cost 1-5 cents per query. To give you an intuition – if Google ran a modern large LLM like GPT-4 for all search queries, it would see profits go down by roughly $10B. So, understanding the task and architecting for the right price/performance is imperative. There is a ton of innovation and focus on cost engineering today – from semiconductors to newer model architectures and training and inferencing techniques – all aimed at getting this price/performance balance right.
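To make the inference-economics point concrete, here is a back-of-envelope model. The per-query cost uses the midpoint of the 1-5 cent range quoted above; the query volume is a purely hypothetical assumption for a large consumer service:

```python
# Back-of-envelope inference-cost model (illustrative assumptions only).
cost_per_query = 0.03            # dollars; midpoint of the 1-5 cent range
queries_per_day = 100_000_000    # hypothetical high-volume service
annual_cost = cost_per_query * queries_per_day * 365
print(f"~${annual_cost / 1e9:.1f}B per year in inference compute")
```

At this scale, shaving even a cent off each query is worth hundreds of millions of dollars a year, which is why so much engineering effort goes into smaller models, distillation, and cheaper inference.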

    Third, there are well-documented risks that are still not fully understood. The problem of bias and hallucinations is well documented, and there are also unknown cybersecurity risks and copyright and IP issues that enterprises need to worry about. Lastly, these models are only as good as the data used to train them, and they make mistakes – Google Bard’s infamous factual error on debut is a good reminder that AI is neither artificial nor intelligent.

    M.R.: Where are we in the adoption curve of Generative AI and where do you believe this is all going?

    Pranay: We are still in the early innings here. We are seeing a ton of enterprises experiment and run pilots and POCs, but almost no adoption at scale. Certain use cases like marketing, customer support and product development are more ready and have out-of-box tooling, e.g. Jasper and GitHub Copilot. The reported performance gains vary significantly, however. There are many numbers, even from reputable sources, that are conjecture without any tangible evidence. Companies should evaluate these tools and assess impact before building business cases.

    I believe adoption in the enterprise will be slower than most estimates. There are many underlying reasons for that – lack of a strategy and clear business case, lack of talent, lack of curated data, unknown technology risks, etc. The biggest challenge is that of change management – according to BCG’s well-known 70:20:10 framework, 70% of the investment in adopting AI at scale is tied to changing business processes, versus 20% in broader technology and only 10% in algorithms. These physics will remain the same.

    We must also acknowledge that generative AI itself isn’t a silver bullet and that we are at the very top of the hype cycle. Get your popcorn, the movie has just begun!

    M.R. Rangaswami is the Co-Founder of Sandhill.com


    M.R. Asks 3 Questions: Molham Aref, CEO of RelationalAI


    AI is the conversation we can’t get away from, so we’re doing our best to bring you as many perspectives, experts and insights into how enterprises are adapting, incorporating and utilising its rapid advancements.

    Molham Aref is CEO of RelationalAI, an organisation building intelligence into the core of the modern data stack. He has had a more than 15-year career in AI, investigating and implementing how knowledge graphs benefit the building of intelligent data applications.

    M.R.: Generally speaking, how do you see AI advancing enterprise?

    Molham Aref: AI is an expansive concept that encompasses a wide range of predictive and analytical technologies. Gartner coined the term Composite AI to reflect the fact that AI in the enterprise is combining these technologies to help build intelligence into organizations’ decision making and applications. AI provides great opportunities to drive smarter and more insightful outcomes.

    Using AI, organizations can improve their decision making and achieve more reliable outcomes. The emergence of large language models (LLMs) has driven AI to an inflection point that requires a combination of techniques to generate results that cannot be achieved by point solutions.

    By leveraging AI, organisations can make accurate forecasts, anticipate customer behavior, and optimize resource allocation. This allows them to proactively address challenges, identify opportunities, and ultimately become more profitable. 

    M.R.: How are you incorporating knowledge graphs in your work with AI and the enterprise?

    Molham: Knowledge graphs were pioneered by technology giants like Google early on to improve search results and LinkedIn to understand connections between people. The technology models business concepts, the relationships between them, and an organization’s operational rules.  

    Specifically, a knowledge graph organizes data in a human-readable form, augmenting it with knowledge about the enterprise in a way that allows organizations to take their data, reason over it, and create inferences with the goal of making better decisions. This can be done in a variety of ways, including with graph analytics, which focuses on connections in the data.
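As a minimal sketch of that idea, the snippet below stores facts as subject-relation-object triples and derives a fact that was never stored explicitly. All entity names and the single transitivity rule are invented for illustration:

```python
# Hypothetical mini knowledge graph: facts as (subject, relation, object)
# triples, plus one inference rule ("supplies" is transitive) showing how
# reasoning derives facts that were never stored explicitly.
facts = {
    ("WidgetCo", "supplies", "AcmeRetail"),
    ("AcmeRetail", "supplies", "CornerStore"),
    ("WidgetCo", "located_in", "Ohio"),
}

def infer_transitive(triples, relation):
    """Add every triple implied by treating `relation` as transitive."""
    derived = set(triples)
    changed = True
    while changed:
        changed = False
        for (a, r1, b) in list(derived):
            for (c, r2, d) in list(derived):
                if r1 == r2 == relation and b == c and (a, relation, d) not in derived:
                    derived.add((a, relation, d))
                    changed = True
    return derived

graph = infer_transitive(facts, "supplies")
# The inferred triple ("WidgetCo", "supplies", "CornerStore") captures an
# indirect supply-chain relationship no one entered by hand.
```

Production knowledge graphs use far richer rule languages and scale-out storage, but the reason-over-relationships loop is the same shape.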

    Organizations can augment their predictive models with an understanding of the relationships that exist between their data, for example, inventory and profit. These enhanced models enable organisations to arrive at decisions that make them more effective, more competitive, and more successful. 

    Knowledge graphs are proving to be one more tool in the toolbox that will significantly advance the enterprise.

    M.R.: What do you see the future benefits being for organisations who build intelligent data applications?

    Molham: Imagine a world where applications seamlessly adapt to your data, driven by intelligent capabilities. Where your applications can take action on your behalf, notify you to make important decisions, and dynamically make recommendations in response to sudden changes.

    Once organizations understand the potential impact of AI, they start to embrace technologies like knowledge graphs and data clouds. And with the modern AI stack complete, they can start building applications that let them automate workloads.

    With intelligent applications making the easy decisions, humans are freed up to work on the things that are more interesting and complex. Intelligent applications take the drudgery and tedium out of business operations, so that experts can focus more of their time and energy on decisions and tasks that will have a bigger impact, are harder to make, or require more human ingenuity than can be codified in software. 

    M.R. Rangaswami is the Co-Founder of Sandhill.com


    M.R. Asks 3 Questions: Dr. Alan Baratz, CEO of D-Wave


    Dr. Alan Baratz’s career picked up momentum when he became the first president of JavaSoft at Sun Microsystems. He oversaw the growth and adoption of the Java platform from its infancy to a robust platform supporting mission-critical applications in nearly 80 percent of Fortune 1000 companies. It was that vast experience, among many, that brightly lit the path for Alan’s next role with D-Wave.

    First, as D-Wave’s Executive Vice President of R&D, Alan was the driving force behind the development, delivery, and support of all of D-Wave’s products, technologies, and applications. Now, having spent the last three years as D-Wave’s CEO, Alan is hitting a new stride and taking his organization to the next level.

    M.R. Rangaswami: Can you provide an overview of D-Wave’s technology and the state of the quantum computing market today?

    Dr. Alan Baratz: It’s an incredibly exciting time in the quantum computing market, as we’re starting to see companies and governments around the world increasing both interest and investment in the technology. In fact, a study from Hyperion Research found that more than 80% of responding companies plan to increase quantum commitments in the next 2-3 years and one-third of those will spend more than $15 million annually on quantum computing efforts.

    The accelerated adoption of quantum computing comes at a time when businesses are facing difficult economic headwinds and are looking for solutions that help reduce costs, drive revenue and fuel operational effectiveness. Quantum’s power and potential to tackle computationally complex problems make it an important part of any modern enterprise’s tech stack.

    And the market potential is significant. According to Boston Consulting Group, quantum computing will create a total addressable market (TAM) of between $450–$850 billion in the next 15 to 30 years, reaching up to $5B in the next three to five years. Many problems, especially those relating to optimization, can be solved with today’s systems.

    There are two primary approaches to quantum computing – quantum annealing and gate model. While you may have heard that quantum computing won’t be ready for years, that longer timeline refers only to gate model.

    The reality is that practical quantum solutions, those that use quantum annealing systems, are already in market now, helping organizations solve some of their biggest challenges.

    D-Wave customers are using our Leap™ quantum cloud service to gain real-time access to our quantum computers and hybrid solvers to tackle some of their most complex optimization problems. We offer a full-stack quantum solution – hardware, software and professional services – to give customers support throughout their quantum journey. And given our QCaaS (quantum computing-as-a-service) approach, we make it very easy for the enterprise to incorporate the technology into their compute infrastructure.

    M.R.: What are some examples of commercial applications you’re seeing?

    Alan: Optimization is an enterprise-wide challenge that businesses of all kinds face – whether they’re in financial services, manufacturing, logistics, life sciences, retail and beyond. Many common yet computationally challenging problems like employee scheduling, offer allocation, e-commerce delivery, cargo logistics, and supply chain distribution can all be represented as optimization problems, and thus solved by today’s quantum annealing technology. These problems are made more difficult by the vast amount of data generated daily, which can quickly translate into critical pain points that impact a business’ bottom line.

    We’re seeing organizations increasingly turning to quantum-hybrid applications to address these optimization challenges. For example, the nation’s largest facility for handling shipborne cargo used D-Wave technology to optimize port operations, resulting in a 60% increase in crane deliveries and a 12% reduction in turnaround time for trucks.

    A major credit card provider is using quantum-hybrid applications to optimize offer allocations for its customer loyalty and rewards program to increase cardholder satisfaction while maximizing campaign ROIs. And a defense company created a quantum-hybrid application for missile defense that was able to consider 67 million different scenarios to find a solution in approximately 13 seconds.

    The commercial value is apparent, and if you’re not currently exploring quantum in your enterprise, I believe you’re already behind.


    M.R.: What’s next for quantum computing?

    Alan: The pace of innovation and progress in quantum computing is remarkable. From a commercial exploration and adoption perspective, I believe we’re going to see a major uptick in the near term, as more organizations recognize the technology’s potential and increase investments. Quantum has moved out of the lab and into the boardroom.

    It’s no longer just relegated to R&D teams to play with; it has captured the attention of business decision-makers faced with increasingly challenging and complex problems that require faster time-to-solution. With the increased adoption will come rapid development of proofs-of-concept and, ultimately, production applications that will help streamline daily enterprise operations.


    From a scientific view, I expect major developments on the horizon as quantum annealing technology further scales and reaches even higher qubit counts and coherence times. Gate-model development will continue to progress, as the industry hopes to eventually find a path toward low-noise systems that can actually solve problems. Lastly, we all will continue our efforts to demonstrate quantum’s advantage over classical compute for intractable problems.

    We’re already seeing positive signs at D-Wave, as recent research findings contribute to a growing body of research that may lead us to the first practical quantum supremacy result.


    M.R. Asks 3 Questions: Riddhiman Das, Co-Founder & CEO, TripleBlind


    On a recent tour of healthcare organizations across the nation, Riddhiman started closely evaluating how different organizations are securing their data and, even more importantly, how they are securely accessing and sharing it.

    From developing new drugs and medical devices to allocating scarce resources amidst supply chain issues, most advancement in healthcare hinges on having access to the right data. Moreover, some of the most sensitive and highly regulated data requires technology solutions that take all of that into account to solve this complex challenge.

    Riddhiman recognizes the limits of the traditional solutions used to tackle data problems, and has ideas for how the next wave of innovation can allow the healthcare industry to gain insights from health data while maintaining privacy.

    M.R. Rangaswami: Data is arguably the most critical driver of innovation in healthcare today. What trends is this driving and what are some key “amount of data” stats in healthcare?

    Riddhiman Das: I believe that data is the most critical driver of innovation in healthcare but there are limitations because the data is sensitive and as a result, regulated. Everything in healthcare hinges on having access to the right data: From developing new drugs and medical devices to allocating scarce resources amidst supply chain issues.

    It’s no secret that having continuous access to raw health data is invaluable; this fact is well established. However, recent advances in analytics, machine learning, and artificial intelligence have brought us to a tipping point where healthcare can no longer ignore the value of having access to data.

    And get this, privacy and compliance concerns have trapped two Zettabytes of data in silos and removed $500B in value creation for healthcare organizations.

    M.R.: If we know healthcare has a data problem, how have we traditionally been trying to tackle it?

    Riddhiman: Historically, organizations have tried to get around limited access to data by using synthetic, abstracted, or pre-anonymized datasets, but that strategy just doesn’t cut it. The method tends to be expensive and can result in flawed insights if the data contains errors or is missing a key element –  that doesn’t really benefit anyone. 

    We need access to data to drive the next wave of innovation—people’s health and well-being depend on it. We can only achieve this if the data is kept private to maintain patient privacy and the intellectual property rights of healthcare companies and their industry partners. 

    Over the years, initiatives have emerged to address this. Everyone has heard of HIPAA, which was enacted to protect patients’ health information from disclosure without their consent or knowledge. It also features standards designed to improve efficiency in the healthcare industry. The less-talked-about Sentinel Initiative was created to monitor the safety of medical products via direct access to patients’ electronic health records. Despite legislation and initiatives to help with this problem, the challenge remains and will only become more amplified as health data grows in volume and complexity. 

    Organizations have been shooting themselves in the foot by relying on manually de-identifying, abstracting, or normalizing data to get the insights they need. It’s nearly impossible to obtain meaningful, accurate, real-time insights from health data in this manner. This outdated method is hardware dependent, poses potential risks for re-identification, offers only partial security, and generally only works on structured or specific types of data. 

    M.R.: What are some fresh solutions to data and data privacy in healthcare you have seen?

    Riddhiman: We’ve seen quite a few technology solutions developed in recent years that tackle this issue in a way that allows healthcare organizations the ability to gain insights from data and maintain privacy beyond what regulations require. 

    Privacy-enhancing technologies (PETs) were specifically designed to make gleaning insights from health data scalable, accurate, and secure: a true win-win. One PET we’re truly excited about? Federated analytics.

    Federated analytics improves upon prior PETs and keeps health data safe in three ways. First, the data is secured at its point of residence so that external parties cannot access it in any meaningful way. Second, the data is kept secure as parties collaborate to decrease the risk of interception. Finally, the data is secured during computation, reducing the risk of sensitive information extraction. Organizations can also track how the data is used to ensure it is only leveraged for its intended purpose.

    Federated analytics software lowers the risks associated with sharing health data by eliminating decryption and movement of raw data, while allowing privacy-intact computations to occur. Additionally, technology improvements driven by federated analytics minimize the computational load necessary to analyze data, which reduces hardware dependency and increases scalability.
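A toy version of that compute-without-revealing-raw-data idea is pairwise additive masking, one of the building blocks behind secure aggregation. The sketch below is illustrative only: it assumes a single trusted coordinator round and omits real-world machinery such as key exchange, dropout handling, and authenticated channels.

```python
import random

def federated_masked_sum(site_values, modulus=2**31):
    """Each pair of sites agrees on a random mask that cancels in the
    global sum, so the coordinator only ever sees masked values, never
    any single site's raw statistic."""
    n = len(site_values)
    pair_masks = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            r = random.randrange(modulus)
            pair_masks[i][j] = r     # site i adds the mask
            pair_masks[j][i] = -r    # site j subtracts the same mask
    masked = [(site_values[i] + sum(pair_masks[i])) % modulus for i in range(n)]
    return sum(masked) % modulus     # masks cancel: equals the true sum
```

Calling `federated_masked_sum([120, 85, 240])` returns 445, the true total, even though each transmitted value looks random in isolation.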

    Other benefits include access to raw data beyond just structured data, including video, images, and voice data; more secure internal (across regulatory boundaries) collaboration and external (between organizations) collaboration; and a lower chance of non-compliance due to simplified, more cohesive contracting processes. 

    Federated analytics is driving healthcare towards the future. By safely scaling access to raw health data, organizations can optimize processes for clinical trials, develop and deploy groundbreaking AI algorithms, and bolster pharmacovigilance. Thanks to the development of federated analytics solutions, there is no longer a need to choose between gaining powerful insights that will shape the future of healthcare and keeping patient data private.

    M.R. Rangaswami is the Co-Founder of Sandhill.com


    M.R. Asks 3 Questions: Slavik Markovich, Co-Founder & CEO, Descope


    Does password authentication really work anymore?

    Descope Co-Founder and CEO Slavik Markovich has spent years watching the problems with traditional password authentication mount, from user difficulties to security vulnerabilities.

    As a solution, Descope is developing sound passwordless methods, such as magic links, one-time passwords, social login, authenticator apps, and biometric authentication, that are gaining traction due to the rise of open standards and support from major companies like Google, Apple, Microsoft, and Shopify.

    In this conversation, Slavik gets straight into the user experience and the solutions we are seeing that work.


    M.R. Rangaswami: Why is passwordless authentication picking up steam? 

    Slavik Markovich: Passwords cause friction throughout the user journey, leading to churn and a negative user experience. No one wants the cognitive load of remembering unique 16-character passwords for every site or app they access, so they reuse passwords across sites – a recipe for disaster when passwords get leaked.

    Passwordless methods such as magic links, social login, and authenticator apps have been around for a while. Notable apps like Medium and Slack already use passwordless login, while authenticator apps are used as a common second factor in MFA.

    However, the rise of open standards and mechanisms such as FIDO2, WebAuthn, and passkeys over the past few years has sent passwordless adoption into overdrive. There are a few reasons at play here:

    • Passkeys are based on biometrics, which users are familiar with since they already use fingerprint scanning and facial recognition to unlock their phone or other computing devices.
    • Passkeys are being adopted by Internet heavyweights such as Google, Apple, Microsoft, and Shopify, who are also taking steps to educate users about the benefits of these methods.

    M.R.: What are some examples of passwordless authentication techniques? 

    Slavik: Passwordless methods verify users through a combination of possession (what they have) and inherence (who they are) factors. These factors are typically harder to spoof and are more reliable indicators of a user’s identity than knowledge factors are.

    These examples include: 

    • Magic links, which are URLs with embedded tokens that – when clicked – enable users to log in without needing a password. These links are mostly delivered to the user’s email account, but can also be sent via SMS and other messaging services like WhatsApp.
    • One-time passwords / passcodes, which are dynamically generated sets of numbers or letters meant to grant users one-time access to an application. Unlike passwords, an OTP is not static and changes every time the user attempts login.
    • Social login, which authenticates users based on pre-established trust with an identity provider such as Google, Facebook, or GitHub. Using social login precludes users from creating another set of credentials – they can instead focus on strengthening the passwords they already have on their identity provider account.
    • Authenticator apps, which operate based on time-based one-time passwords (TOTP). A TOTP code is generated with an algorithm that uses a shared secret and the current time as inputs – this means the code changes at set time intervals, usually between 30 to 90 seconds.
    • Biometric authentication, which grants users access to applications by checking physical or behavioral traits that are unique to an individual. Popular biometric techniques in use today include fingerprint scanning and facial recognition. Biometrics are also used in passkey authentication, which I covered in the previous answer.
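The authenticator-app mechanism described above can be sketched in a few lines of Python. This follows the standard RFC 6238 TOTP construction (HMAC over a time-step counter, with dynamic truncation) and is a simplified illustration rather than production code:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(base32_secret, interval=30, digits=6, now=None):
    """Time-based one-time password: HMAC-SHA1 of the current time step,
    truncated to a short numeric code (RFC 6238-style sketch)."""
    counter = int((time.time() if now is None else now) // interval)
    key = base64.b32decode(base32_secret, casefold=True)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

Because the server and the authenticator app hold the same shared secret and roughly synchronized clocks, both derive the same code within each interval, and the code changes automatically at the next time step.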


    M.R.: How do you see this technology evolving over the next several years? 

    Slavik: I see the evolution of passwordless technologies mostly focusing on education and compatibility in the years to come. The key pillars will be:

    • User education: Companies and the industry at large need to continue educating end users about the benefits of passwordless methods and the pitfalls of passwords. Common myths about passwordless methods like biometrics (e.g. “what if someone steals my biometrics?”) still need to be addressed (e.g. your biometrics never leave your device).
    • Developer education: Standards and protocols such as OAuth, SAML, WebAuthn, and others that form the basis of authentication mechanisms are complex. It takes developers time to pore over these protocols and implement authentication in their apps. Developers need to be provided with tools and enablement that abstract away the complexity of these protocols and let them add passwordless methods to their apps without lots of added work.
    • Compatibility: Passkey compatibility is a work in progress. Over the coming months and years, more apps, browsers, and operating systems need to support passkeys if a passwordless future is to become reality.

    All three points above are interrelated. If user education and developer enablement continue improving, more entities will be incentivized to add passwordless support, and vice versa.

    M.R. Rangaswami is the Co-Founder of Sandhill.com

    Read More

    Vertical SaaS vs. Horizontal SaaS

    By Article

    Horizontal SaaS vs. Vertical SaaS – Which flavor of SaaS do you prefer?

    Allied Advisers has updated their previously published Flavors of SaaS report, which includes an analysis across a select group of companies comparing operational metrics across the two flavors of SaaS: Horizontal SaaS and Vertical SaaS.

    As advisors who have worked across both flavors, they’re sharing some interesting differences.

    While Horizontal SaaS companies generally have a larger TAM, Vertical SaaS companies can be more capital efficient and have better operational metrics, making them better suited for middle-market funds.

    While there are category leaders in Horizontal SaaS, there are also a lot of opportunities in building Vertical SaaS companies which can become leaders in their own sectors. In today’s environment where capital efficient growth is being keenly measured, Vertical SaaS companies offer compelling opportunities for investors and buyers.

    FOUR HIGHLIGHTS FROM THE REPORT:

    I. Many SaaS firms focus on Vertical SaaS models to target a specific niche, allowing them to better serve industry-specific client demands and making them easier to market.

    II. Vertical SaaS has seen rapid growth of businesses with smaller but more focused TAM (as compared with Horizontal SaaS) and generally more capital efficient business models.

    III. The market downturn in 2022 and Covid impacted some Vertical SaaS markets but overall digital transformation continued to accelerate within industries, with standardized solutions not being sufficient to address vertical needs.

    IV. We see continued investor interest in Vertical SaaS due to high growth prospects supported by strong business fundamentals, along with generally better performance on multiple metrics than peer Horizontal SaaS companies.

    For the full Allied Advisers report, see below:

    Gaurav Bhasin is the Managing Director of Allied Advisers

    Read More

    M.R. Asks 3 Questions: Peter Maier, SVP of SAP

    By Article

    With a new book on the market, Business as UNusual with SAP, we have been looking forward to talking with Vinnie Mirchandani and two senior VPs at SAP about the megatrends they’re seeing powerfully ripple across the industry.

    As co-author and SVP of Strategic Customer Engagements at SAP, Peter Maier was a great person to speak with, elaborating on how megatrends are changing competitive playing fields and shaping best business practices.

    M.R. Rangaswami: What was the motivation for you and your co-author, Thomas Saueressig, to write the book, Business as UNusual with SAP?

    Peter Maier: In our customer conversations, Thomas and I experience every day how megatrends are driving the business and technology agenda of our customers. We found it worthwhile to share their voice and perspective on how leaders successfully navigate industry megatrends using the capabilities of our intelligent suite and our intelligent technologies. 

    There are a few simple but deep principles that drive SAP’s product and innovation strategy for our customers in their industries: we focus on our customers’ core business, because that’s where they drive revenue, competitive differentiation and strategic transformation of business models and business operations.

    Then we look at end-to-end processes that run along value chains and across industry and company boundaries (that’s why digital business networks are so important). And we use a business filter when we look at new digital technologies: which have the potential to transform our customers’ business?

    Artificial intelligence is a great example here: we believe there is huge business potential – but realizing this potential requires integrated end-to-end industry processes. So each megatrend can transform the business of our customers in their industries – and digital technologies are key enablers.

    M.R.: In your opinion, what makes this period of time “unusual”?

    Peter: Consultants have been claiming for decades that ongoing change requires customers to adjust their strategies and operations. However, the last three years have shown us how fundamentally and quickly our world can change and how important the ability to rapidly adapt has become. Multi-year corporate programs have been compressed into quarters, months, and weeks. Fundamental beliefs have gone out of the window. And we perceive a new open-mindedness among many leaders to try new things – to embrace the idea of running a “business as unusual”. So we think it makes sense to use this momentum and start customer engagements to discuss how megatrends can inspire new ways of doing things. 

    Many people feel threatened by change. If you look into the root cause of this reaction, you’ll find that change is stressful if it outpaces your ability to adjust to it or even take advantage of it. This is a very good reason to build and run an organization so that it can cope with disruptive change easily, or at least better than its peers. And this change comes from all directions – just look at drivers like generative AI, sustainability, virtual reality, the metaverse, geopolitical conflicts, or pandemics. “Prepping” for all eventualities is certainly not the answer, but building and running an intelligent, sustainable, resilient, and agile enterprise certainly is. And many companies and institutions look to SAP to find solutions for this transformation.

    M.R.: What are the most opportunistic and problematic trends that the book covers?

    Peter: We believe that every single megatrend we are discussing holds threats and promises, depending on the reader’s attitude to running a “business as unusual.” Moving from selling products to providing and monetizing the outcome of using the product (“Everything as a service”) can be viewed as a problem for a business – or it can be treated as a great opportunity to create and expand new revenue streams, develop new business models, and establish fresh customer relationships.

    Moving to a “circular economy” drives change in product design, supply chain, procurement practices, and product-end-of-life management in many industries. Whether this change is a reason for optimism or pessimism depends on whether this change is viewed as an opportunity or threat. And you will find the same duality in every single megatrend.

    Over the course of our research and the discussions with customers, partners, and SAP experts the opportunity/threat balance clearly shifted from seeing problems and challenges to appreciating the potential for innovation and new business relationships. And of course, we are very happy and pleased that our SAP solutions will play key roles in tackling the challenges and capturing the promised value from transforming business processes and business models.

    There are many digital technology trends – most prominently artificial intelligence – which we don’t feature in Business as UNusual with SAP as megatrends.

    Business as UNusual with SAP focuses on business megatrends and how they shape and change competitive playing fields and best business practices, or how they transform end-to-end business processes along value chains and across industry boundaries.

    Technology has always influenced, accelerated, and sometimes triggered business megatrends, and you will find that digital and other technologies and their impact are discussed in the context of each megatrend, from Lifelong Health to New Customer Pathways and from Integrated Mobility to the Future of Capital and Risk.

    M.R. Rangaswami is the Co-Founder of Sandhill.com

    Read More

    M.R. Asks 3 Questions: Dr. Yu Xu, Founder & CEO of TigerGraph

    By Article

    With 26+ patents in parallel data management and optimization, TigerGraph’s founder and CEO, Dr. Yu Xu, has extraordinary expertise in big data and database systems.

    Having worked on Twitter’s data infrastructure for massive data analytics and led Teradata’s big data initiatives as a Hadoop architect, not only does Yu have an impressive resume, but his ability to explain detailed concepts in a simplified way made for easy conversation.

    M.R. Rangaswami: Graph databases are gaining momentum as more organizations adopt the technology to achieve deeper business insights. What exactly is a graph database?

    Yu Xu: The world is more hyper-connected than ever before, and the ability to tap into the power of rich, growing networks – whether that be financial transactions, social media networks, recommendation engines, or global supply chains – will make or break the bottom-line of an organization. Given the importance of connections in the modern business environment, it’s critical for database technology to keep up.

    Legacy databases (known as relational or RDBMS) were built for well-mapped, stable and predictable processes like finance and accounting. These databases use rigid rows, columns and tables that don’t require frequent modifications, but are costly and time-consuming when adjustments need to be made.

    The graph database model is built to store and retrieve connections from the ground up. It’s more flexible, scalable and agile than RDBMS, and is the optimal data model for applications that harness artificial intelligence and machine learning. 

    A graph database stores two kinds of data: entities (vertices) and the relationships between them (edges). This network of interconnected vertices and edges is called a graph. Graph database software stores all the records of these interconnected vertices, attributes, and edges so they can be harnessed by various software applications. AI and ML applications thrive on connected data, and that’s exactly what graph technology delivers.
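    The vertex-and-edge model described above can be illustrated with a toy Python sketch – all names here (Graph, FOLLOWS, BOUGHT) are invented for illustration and are not TigerGraph’s API:

    ```python
    from collections import defaultdict

    # Minimal graph store: vertices carry attributes,
    # edges are typed connections between vertex IDs.
    class Graph:
        def __init__(self):
            self.vertices = {}                  # vertex id -> attribute dict
            self.edges = defaultdict(list)      # vertex id -> list of (edge_type, neighbor id)

        def add_vertex(self, vid, **attrs):
            self.vertices[vid] = attrs

        def add_edge(self, src, edge_type, dst):
            self.edges[src].append((edge_type, dst))

        def neighbors(self, vid, edge_type):
            return [dst for etype, dst in self.edges[vid] if etype == edge_type]

    g = Graph()
    g.add_vertex("alice", kind="user")
    g.add_vertex("bob", kind="user")
    g.add_vertex("camera", kind="product")
    g.add_edge("alice", "FOLLOWS", "bob")
    g.add_edge("bob", "BOUGHT", "camera")

    # Two-hop query: products bought by people Alice follows -
    # the kind of connected-data question graph databases are built for.
    recs = [p for friend in g.neighbors("alice", "FOLLOWS")
              for p in g.neighbors(friend, "BOUGHT")]
    print(recs)   # -> ['camera']
    ```

    In a relational database, the same two-hop question would require joining user, follow, and purchase tables; in the graph model it is a direct traversal along edges.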

    M.R.: What’s the difference between native and non-native graph databases?

    Yu: As graph technology grows in popularity, more database vendors offer “graph” capabilities alongside their existing data models. The trouble with these graph add-on offerings is that they’re not optimized to store and query the connections between data entities. If an application frequently needs to store and query data relationships, it needs a native graph database. 

    The key difference between native and non-native graph technology is what it’s created for. A native graph database uses something called index-free adjacency to physically point between connected vertices to ensure connected data queries are highly performant. Essentially, if a database model is specifically engineered to store and query connected data, it’s a native graph database. If the database was first engineered for a different data model and added “graph” capabilities later, then it’s a non-native graph database. Non-native graph data storage is often slower because all of the relationships in the graph have to be translated into a different data model for every graph query. 
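    A toy contrast may help make index-free adjacency concrete – these are purely hypothetical structures, not any vendor’s implementation. A native-style vertex holds direct references to its neighbor objects, so a traversal is a constant-time pointer hop; a non-native-style store keeps edges in a separate structure that must be consulted on every hop:

    ```python
    # Native-style: each vertex holds direct references to its neighbors
    # (index-free adjacency), so following an edge is a pointer dereference.
    class NativeVertex:
        def __init__(self, name):
            self.name = name
            self.neighbors = []          # direct object references

    # Non-native-style: edges live in a separate lookup structure keyed
    # by vertex ID, so every hop requires a query against that index.
    edge_index = {}                      # src id -> list of dst ids

    def non_native_neighbors(vid):
        return edge_index.get(vid, [])

    a, b = NativeVertex("a"), NativeVertex("b")
    a.neighbors.append(b)                # native: traverse a.neighbors directly
    edge_index["a"] = ["b"]              # non-native: consult the index per hop
    ```

    The cost difference is small for one hop but compounds across the multi-hop traversals typical of graph workloads, which is the performance gap the answer above describes.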

    M.R.: What are some ways that businesses are leveraging graph databases?

    Yu: The use cases for graph technology are vast, diverse, and growing. If an application frequently queries and harnesses the relationships between users, products, locations, or any other entities, it will benefit from a native graph database. The same is true if a use case leverages network effects or requires multiple-hop queries across data.

    Some of the most popular use cases for graph include fraud detection, recommendation engines, supply chain management, cybersecurity, anti-money laundering, and customer 360, just to name a few. If your enterprise relies on graph analytics or graph data science, then it needs a native graph database to ensure real-time performance for mission-critical applications. 

    M.R. Rangaswami is the Co-Founder of Sandhill.com

    Read More

    M.R. Asks 3 Questions: Ayal Yogev, Co-Founder & CEO of Anjuna

    By Article

    Ayal Yogev is the co-founder and CEO of Anjuna, the leading multi-cloud confidential computing platform. Ayal firmly believes that the best security solutions are enablers – they open up new opportunities that wouldn’t exist without a heightened level of security and trust. To achieve this, the industry needs a new way of thinking, building, and delivering applications that keeps enterprises in the driver’s seat and keeps their data protected at all times. 

    Ayal is passionate about giving companies the freedom to run applications anywhere in the world with complete data security and privacy. That’s why he co-founded Anjuna.  

    With over two decades of experience in the enterprise security space, Ayal shares his thoughts on how confidential computing will impact the cybersecurity landscape. He explains how confidential computing will be the antidote to today’s patchwork of ineffective security solutions, and how it’s poised to make security an enabler of innovation rather than an inhibitor. 


    M.R. Rangaswami: Can you explain what confidential computing is and why it’s now seeing increased momentum? 

    Ayal Yogev: The majority of today’s cybersecurity solutions focus on detecting a breach once it’s already happened, then dealing with the repercussions. However, this approach leaves applications and data extremely vulnerable. Confidential computing addresses this vulnerability by processing data inside a hardware-isolated secure enclave, which ensures that data and code are protected during processing. Even in the event of a breach, applications running in confidential computing environments are invisible to attackers and therefore tamper-proof. 

    Confidential computing has seen rapidly growing support from cloud service providers and hardware manufacturers such as Intel, AMD, Nvidia, and Arm because of its massive, positive impacts on data security. However, it’s largely flown under the radar because of the engineering feat required to re-architect workloads to take advantage of it. Prior to Anjuna, it would take significant developer effort to re-code an application to work in just one of the clouds and then you’d have to repeat the work for each cloud you wanted to use. This is a daunting idea for many enterprises and a big reason why adoption has been slow. But this is changing. 

    Similar to VMware with server virtualization, Anjuna provides a new specialized software layer that allows enterprises to take advantage of the new hardware capabilities without the need to recode. Anjuna abstracts away the complexity of confidential computing CPUs and democratizes access to this powerful technology, which will redefine security and the cloud. 

    M.R.: Which industries and companies are adopting this technology and what are the impacts they’ve seen?

    Ayal: According to IDC, less than half of enterprise workloads have moved to the cloud. Regulated verticals like financial services are only 20% of the way into their cloud journeys, meaning that 80% of workloads remain on-premises. Although running applications on-premises is less scalable, more complex, and typically more expensive than running them in the cloud, CIOs are held back from moving to the cloud by security concerns, because in the cloud data security and privacy become a shared responsibility between you and your cloud service provider. Confidential computing finally solves this fundamental issue by isolating code and data from anyone with access to your infrastructure. 

    The value of confidential computing is broadly applicable and I expect that a few years from now confidential computing will be how all enterprise workloads run. In the short term, we see most security-conscious and heavily regulated organizations as the early adopters. Anjuna, for example, works with companies in financial services, government, blockchain, and other highly sensitive industries. 

    M.R.: When can we expect to see this technology impact our daily lives? What will this look like?

    Ayal: Confidential computing is already present in our everyday lives – we use it to protect our phones, credit cards, and more. This is now moving to the server side, and in the future it will move everything to the edge, creating a world of borderless computing.

    Adoption of confidential computing is at an inflection point. The ecosystem of manufacturers and cloud service providers has already moved. Intel, AMD, ARM, Nvidia, AWS, GCP, Azure, Oracle, and IBM have already shipped, or are about to ship, confidential computing enabled hardware and cloud services. What we’ve been missing is the software stack that democratizes access to these new powerful capabilities, making it easy to use for all apps without modifications. 

    I expect that over time, confidential computing will become the de-facto standard for how we run applications. The impact on our daily life will be huge. With ensured data security and privacy, organizations will not only be able to move more applications to the cloud, but also safely adopt emerging technologies like blockchain or AI. Moreover, entire new use cases like cross-organization data sharing and analytics will now be possible with incredible benefits in a wide range of industries like healthcare, financial services, media, and advertising.

    M.R. Rangaswami is the Co-Founder of Sandhill.com

    Read More

    Quick Answers to Quick Questions: Jozef de Vries, Chief Product Engineering Officer, EnterpriseDB

    By Article

    Meet Jozef de Vries, the mastermind behind the cutting-edge product development at EnterpriseDB (EDB), a pioneering company revolutionizing Postgres in the enterprise domain.

    With over 15 years of experience before joining EDB, Jozef has held various positions at IBM, including building the IBM Cloud Database development organization from the ground up.

    In this quick Q&A, Jozef shares how enterprises can leverage Postgres to cater to their database needs and how this open-source platform is shaking up the market.

    M.R. Rangaswami: In your opinion, how will Postgres disrupt the open-source database market?

    Jozef de Vries: Postgres already has disrupted the database market. The only question that remains is how quickly Postgres will take a majority share of the enterprise database market. EDB is exclusively focused on accelerating the adoption of Postgres as the database standard in the enterprise. 

    Taken together, Stack Overflow’s developer surveys show that Postgres is the most loved, most used, and most wanted database in the world, with exponential growth in 2022 and beyond.

    Postgres is the fastest-growing database management system in what Gartner views as an approximately $80 billion market. EDB customers such as MasterCard, Nielsen, Siemens, Sony, Ericsson and others have made Postgres their database standard.

    EDB builds Postgres alongside a vibrant community, disrupting the market with greater scalability and cost savings compared to any other system. With more contributors to Postgres than any other company, EDB delivers unparalleled expertise and power to enterprises looking to adopt Postgres as their database standard. 

    M.R.: How does Postgres (as an open-source object-relational database system) function?

    Jozef: Postgres addresses a wider range of modern applications than any other database today. This means that enterprises that run on Postgres can fundamentally transform their economics and build better applications with greater performance, scalability, and security. 

    When Postgres was designed at the University of California, Berkeley more than 30 years ago, its designers made sure that the underlying data model was inherently extensible. At the time, databases could only use very simple data types, like numbers, strings and dates. Michael Stonebraker, one of EDB’s distinguished advisors and strategists, and his team made a fundamental design decision that makes it easy to add new data types and their associated operations to Postgres. 

    For example, PostGIS is an extension of Postgres that makes it easy to work with geographic data elements: polygons, routes, etc. That alone has made Postgres one of the preferred solutions for mapping systems. Other well-known extensions support document stores (JSON) and key-value pairs (HSTORE).

    This extensible data model, together with the ability to run on every cloud, enables Postgres developers to be enormously productive and innovative.

    Alongside a robust independent open-source community, we have made Postgres an extraordinary database, superior to legacy proprietary databases and more universally applicable for developers than specialty databases.  

    Open source mandates, flexible deployment options, risk mitigation and strong security will drive much broader adoption of Postgres this year and next. EDB supports this with built-in Oracle migration capabilities, unmatched Postgres expertise and 24/7 global support. We uniquely empower enterprises to accelerate strategies, move applications to the cloud and build new applications on Postgres. 

    M.R.: What are the factors accelerating or inhibiting the adoption rate of Postgres?

    Jozef: Purpose-built for general use, Postgres powers enterprises across a wider variety and broader spectrum of applications than any other database, making it the economic game changer for data. There will always be specialty applications that require specialty databases. But for an enterprise standard, developers and IT executives rely on Postgres for the widest range of support. 

    Postgres technology is extraordinary and is improving faster than competing technologies, thanks to the independent nature of the community and EDB’s relentless commitment to Postgres innovation and development. Our technology approach delivers a “single database everywhere” to any platform including self-managed private clouds and self-managed public clouds, but our fully managed public cloud is the most important accelerator. The fact that we simultaneously deliver breathtaking cost reductions is the icing on the cake.

    Additionally, the fact that more developers love, use and want Postgres than any other database in the world is an important “tell” on this prediction. 

    Developers and business leaders alike seek data ownership and control and they simply don’t have time—or money—to waste. That is why they need a Postgres acceleration strategy, and only EDB can provide that.  

    Inhibitors to the adoption of Postgres are primarily awareness, staff education and training — all areas that the C-Suite can play a big leadership role in changing. Great leaders recognize the need for expertise from a company that deeply understands Postgres and enables them to run data anywhere. That’s EDB. 

    Our business is built to remove barriers. Some of the biggest companies in the world including Apple, Daimler, Goldman Sachs, and others have already adopted Postgres as their database standard. It’s not a matter of if, but when the majority of enterprises will follow suit.

    M.R. Rangaswami is the Co-Founder of Sandhill.com

    Read More

    M.R. Asks 3 Questions: Rohit Choudhary, Co-Founder & CEO of Acceldata

    By Article

    Rohit Choudhary is the Co-Founder & CEO of Acceldata, the market leader in data observability.

    Having data alone isn’t enough to deliver value to an enterprise – a report by HT Mint found that over 90% of the data available in the world today was generated over the last two to three years alone. But putting it all together is what drives results. For enterprises, data comes in all shapes and sizes—but in the era of hyper-information, disinformation can be equally destructive.

    Rohit credits his success to his engineering roots, continuous innovation, a humbling sequence of entrepreneurial learning from successes and failures, and cultural alignment that kept his team together for nearly 20 years. Fresh off its $50 Million Series C funding round, Acceldata is leading the charge for the data observability industry, giving operational control back to understaffed data teams while maximizing ROI.

    M.R. Rangaswami: What is Acceldata’s founding story and what led you to raise a significant $50 million Series C funding in the face of economic turmoil?

    Rohit Choudhary: My co-founders and I started Acceldata in 2018 after recognizing that a better solution was needed to monitor, investigate, remediate, and manage the reliability of data pipelines and infrastructure. Having built complex data systems at several of the world’s largest companies, we could see clearly that enterprises were trying to build and manage data products using tools that weren’t optimized for the task. Despite significant investments, data teams still couldn’t see or understand what was happening inside mission-critical analytics and AI applications, failing to meet reliability, cost, and scale requirements. 

    Since our launch, we have seen tremendous company momentum and were fortunate to secure a significant Series C funding round in the midst of an economic downturn. As a result, I can confidently say we’ve built the world’s most comprehensive and scalable data observability platform, correlating events across data, processing, and pipelines to transform how organizations develop and operate data products. Our funding speaks to the true value that organizations across the globe are achieving with data observability, and we’re excited to push the industry even further into the limelight. 


    M.R.: What is the importance of having reliable and established data across the enterprise? What consequences will companies experience without it?

    Rohit: While an organization’s data is among its most valuable assets, data alone isn’t enough to deliver business value to an enterprise. Being able to piece it together to provide meaningful insights is what actually drives results and ROI. 

    With the migration of data and analytics to the cloud, data volume and data movement are more significant than ever. There is data-at-rest, data-in-motion, and data for consumption, each having different stops in the modern data stack that make it difficult for organizations to get a good handle on their data. Data reliability ensures that data is delivered on time with the utmost quality so business teams can make consistent, timely, and accurate decisions.

    In the era of hyper-information, disinformation can be extremely destructive. However, the quality and integrity of the data in hand are what define the return on investment for various analytics and intelligence tools. 

    M.R.: What steps can organizations take to structure a logical plan of action to manage, monitor, and demystify data quality concerns and data outages?

    Rohit: Data observability is the most logical plan of action to manage, monitor, and demystify data quality concerns, misinformation, and data downtimes. Software firms rely on observability as a solution to tackle data quality challenges and pipeline issues. Observability goes above and beyond just routine monitoring. It ensures teams are on top of breakdowns and manages data across four layers: Users, Compute, Pipeline, and Reliability.   

    Throughout the entire data process – from ingestion to consumption – data pipelines are moving data from disparate sources in an attempt to deliver actionable insights. When that data is accurate and timely, those insights help the enterprise gain a competitive advantage, and deliver the promise of an efficient data-driven enterprise. 

    M.R. Rangaswami is the Co-Founder of Sandhill.com

    Read More

    Quick Answers to Quick Questions: Mark Greenlaw, VP of Global Marketing Strategy, Cirrus Data

    By Article

    It’s been said that 2023 is the year hybrid evolves to multi-cloud for enterprises, pushing data migration to the forefront of IT decision-makers’ priorities.

    Data is the lifeblood of the enterprise and now its movements have become even more complicated. In our quick conversation, Cirrus Data’s VP of Global Marketing Strategy, Mark Greenlaw, shared his observations on what’s happening with data mobility speeds, flexible storage architecture, and multi-cloud transformations.

    M.R. Rangaswami: What are companies missing about how digital transformation impacts cloud adoption? 

    Mark Greenlaw: The phrase “digital transformation” has become a sort of catchall to describe everything from the process of modernizing applications to creating new digital business models. In reality, digital transformation is not replicating an existing service, but using technology to transform the service into something significantly better.

    Unfortunately, less than 20% of companies that embarked on digital transformation strategies have been successful. There are varying reasons for the lack of sustained improvements from transformation initiatives, but infrastructure challenges are among the top. The cloud offers relief from rigid on-premises environments and accelerates time to market.

    Public cloud companies now offer flexibility, access to third-party ecosystems, automation, and the ability to truly transform services.


    M.R.: What do you advise organizations to consider before adopting a multi-cloud strategy?

    Mark: Companies have been moving to the cloud for several years, but not all clouds are equivalent. As cloud adoption has grown, it has become clear that different cloud services are ideal for different applications, workloads, and business processes. Today, many organizations harness a mix of private, hybrid and public clouds. Selecting the right cloud service and understanding how it integrates into your environment is an important first step.

    It can be a challenge to determine which cloud is right for each scenario, but once you’ve made that decision, executing the migration is often a roadblock. A ‘lift and shift’ strategy without optimization often doesn’t yield the ROI anticipated. We often hear from organizations that they are surprised by the costs of the cloud. And, once they have moved their workloads to the cloud, moving them between clouds can be cost-prohibitive without the right data mobility solutions in place.

    As part of planning a cloud strategy, data mobility needs to be a key consideration. What is the strategy to de-duplicate and compress your workloads? Do you have a solution that will enable you to move data while it is in use? Can you move data between clouds without exorbitant egress fees? These are all questions that, when tackled at the beginning, will ensure your program’s success.  

    M.R.: Is moving block data to a new environment a high-stakes move?

    Mark: Block data refers to mission-critical databases and applications which are structured data owned directly by applications. The loss of block data can have a catastrophic impact on business operations. Historically, storage experts would spend months planning the migration of this data onto a new storage platform. Legacy migration processes were manual, time-consuming, and prone to human error. For one customer in the travel and leisure industry, their initial attempt to migrate their block data took 18 months and they only managed to move a quarter of the overall traffic. It had a serious impact on their digital transformation plans.

    It’s also important to consider the difference between data migration and data mobility solutions. Data migration is for one-time moves from one platform to another. Data mobility allows organizations to move data between platforms accurately and without delays. Data mobility is essential to maximizing a multi-cloud strategy.  Whether you need to move your data for a specific project or you want the flexibility of continuous data mobility, automation and moving data while it is in use dramatically accelerates the speed of the process.

    When you can automatically throttle the migration speed around usage, you have the ability to reduce the time spent and bandwidth used by up to 6x.  Designing a strategy to manage your data mobility at the beginning of your cloud journey will lead to increased ROI and a better overall experience.
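    As an illustration of usage-aware throttling (a hypothetical sketch, not the actual product logic), a migrator can size each transfer chunk by the spare I/O capacity it observes in production:

```python
def throttled_chunk_size(base_chunk_mb: int, usage_pct: float,
                         floor_mb: int = 4) -> int:
    """Scale the per-cycle migration chunk by spare production capacity.

    usage_pct is current production I/O utilization in [0.0, 1.0];
    floor_mb keeps the migration from stalling entirely at peak load.
    """
    spare = max(0.0, 1.0 - usage_pct)
    return max(floor_mb, int(base_chunk_mb * spare))

print(throttled_chunk_size(256, 0.10))  # quiet window: migrate aggressively
print(throttled_chunk_size(256, 0.95))  # peak hours: back off
```

    The design choice is that migration speed adapts automatically instead of requiring a maintenance window, which is how moving data while it is in use becomes practical.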

    M.R. Rangaswami is the Co-Founder of Sandhill.com


    M.R. Asks 3 Questions: Armon Petrossian, CEO and Co-Founder of Coalesce

    By Article

    CEO and Co-Founder of Coalesce, Armon Petrossian, launched his company from stealth in January 2022 to solve the largest bottleneck in the analytic space: data transformations.

    The 29-year-old entrepreneur focuses on helping enterprises overcome the pressing challenge of converting raw data into a structure suitable for consumption, a process that can take months or even years, so they can meet daily organizational and operational data-driven demands. The company is currently going head-to-head with dbt Labs and Matillion in the data transformation space.

    M.R. Rangaswami: What are the core challenges you find that are associated with operationalizing data?

    Armon Petrossian: Companies have been struggling with data transformation and optimization since the early days of data warehousing, and with the enormous growth of the cloud, that challenge has only increased. Data teams, in particular, are challenged with the everyday demands of the business and the shortage of skilled data engineers and analysts to combat the growing volumes and complexity of data. 

    We are on a mission to radically improve the analytics landscape by making enterprise-scale data transformations as efficient and universal as possible.  We see the value of Coalesce’s technology as an inevitable catalyst to support the scalability and governance needed for cloud computing.

    One of the most rewarding aspects of my role at Coalesce is seeing the impact our solution has on organizations that want to drive value out of their data. This is especially true for companies that deal with complex data sets and/or are in highly regulated industries. 

    One of our most recent customer success stories involves partnering with an organization that helps big restaurant brand clients leverage their customer data to show that the brand knows and understands its customers. Helping its numerous clients improve their digital marketing funnel and offering customers a frictionless experience every time they visit the store, whether in person or online, relies heavily on data. This requires having the ability to glean useful insight from data quickly and easily. Coalesce, alongside Snowflake’s Snowpark, was able to help their data science team complete a high-profile transformation in one month, whereas before, the entire team spent 6 months without much progress.

    M.R.: What exactly is data transformation? Why does it play such a critical role in the future of data management and the analytics space?

    Armon: It’s important to look at how we consume data to understand why data transformations are so important. Initially, organizations that were adopting cloud platforms like Snowflake hit a major hurdle which was getting access to data from their source systems. As that problem has been largely solved by companies like Fivetran, and getting access to different types of data has become much easier, transforming that data to create a cohesive view is the logical next step for businesses to accomplish. This becomes dramatically more difficult as you begin to integrate data from traditional on-premises platforms, like Teradata or Oracle, along with a variety of different web sources. For example, companies may look at vast amounts of historical data to understand how their production line performs in certain scenarios or look into demographic information to target the right potential customers. Whatever the reason, the analytics are only as good as their ability to curate data from various sources and transform it into a consumable format for the analytics and data science teams.

    With Coalesce, the data can be organized in an easy-to-access and read fashion while using automation to streamline the process and limit the amount of time needed by highly skilled engineers. This ensures that companies are accessing high-quality data that is easy to use for a variety of purposes, an experience that is not guaranteed with existing tools. With our column-aware architecture, enterprises have the ability to efficiently and easily manage not only existing data but also new datasets as they grow and scale. 
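    As a toy illustration of what a transformation step does (plain Python with invented data, not Coalesce's actual SQL-generating approach), consider casting and aggregating raw extract rows into an analytics-ready shape:

```python
from collections import defaultdict

# Raw rows as they might arrive from a source system: strings, one row per event.
raw_orders = [
    {"customer": "acme", "amount": "19.50", "channel": "web"},
    {"customer": "acme", "amount": "5.25",  "channel": "store"},
    {"customer": "bolt", "amount": "12.50", "channel": "web"},
]

def transform(rows):
    """Cast types and aggregate per customer: the kind of cleanup that turns
    raw extracts into a consumable table for analysts."""
    totals = defaultdict(float)
    for row in rows:
        totals[row["customer"]] += float(row["amount"])
    return dict(totals)

print(transform(raw_orders))  # {'acme': 24.75, 'bolt': 12.5}
```

    At enterprise scale the same cleanup spans hundreds of sources and columns, which is where automation and column-level tracking pay off.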

    M.R.: What are your best practices for enterprises that are looking to keep up in today’s data-rich world?

    Armon: My suggestions for best practices can be broken down into four areas:

    i. Become Data-Competitive: Data competitiveness is key for every business, but given the enormous amounts of data generated by modern enterprises, IT teams are falling behind in organizing and preparing data so business teams can use it to guide informed decisions.

    ii. Embrace the Cloud: Managing hardware or technology on-premises is expensive, time-consuming and risky. In U.S. history, cars were not nearly as impactful to daily life as a form of transportation until the infrastructure of roads was built across the country. We’re now seeing a similar economic boom with the way the cloud allows access to data for organizations that would have never been able to achieve similar use cases or value previously.

    iii. Evaluate Efficiency: IT teams finally understand how important efficiency can be to help deliver a continued competitive edge for enterprises. When applicable, data automation reduces time, effort, and cost while reducing tedious and repetitive work and allowing teams to focus on additional use cases with high-value data objectives.

    iv. Strive for Scalability: With more data and the proliferation of the cloud, organizations are challenged with scaling IT systems while maintaining flexibility and control. Companies should look to implement processes that offer the speed and efficiency needed to achieve digital transformation at scale and to meet increasing business and customer demands.

    M.R. Rangaswami is the Co-Founder of Sandhill.com


    M.R. Asks 3 Questions: AB Periasamy, Co-Founder & CEO, MinIO

    By Article

    AB’s Unicorn company has pioneered high-performance Kubernetes-native object storage, helping enterprises use the cloud operating model to determine where to run their workloads – depending on what they are optimizing for. 

    As a Series B company, MinIO has $126 million in funding raised to date, with a billion dollar valuation. Investors include Intel Capital, Softbank Vision Fund 2, Dell Technologies Capital, Nexus Venture Partners, General Catalyst and key angel investors.

    As one of the leading proponents and thinkers on the subject of open source software, AB masterfully articulates the differences between philosophy and business models, and how the two shape the way the cloud functions.

    M.R. Rangaswami: Can you explain all this chatter about cloud repatriation?

    AB Periasamy: Simply put, “cloud repatriation” means moving workloads from public clouds back to a private cloud. For years, the mantra of the cloud was fairly straightforward: put everything in the public cloud and keep it there forever. This model made sense as businesses optimized for elasticity, developer agility, service availability and flexibility.

    Things changed when businesses reached scale, however, as the benefits were swamped by economics and lock-in. This is leading many enterprises to re-think their approach to the cloud – with a focus on the operating model of the cloud – not where it runs. 

    It’s important to remember the cloud operating model has a cycle. There are times to leverage the public cloud. There are times to leverage the private cloud. There are times to leverage the colo model. Given the ecosystem that has built up around the cloud – there is certainly self-interest in driving enterprise workloads in that direction – there are the consulting fees to get you there and the consulting fees to manage costs once you realize it is more expensive than forecasted. Nonetheless, sophisticated enterprises are increasingly taking their own counsel on determining what is best for the business – and that is driving the repatriation discussion. 

    M.R.: What are the key principles of the cloud operating model?

    AB: The cloud is not a physical location anymore. Today, the tooling and skill set that was once the dominion of AWS, GCP and Azure, is now available everywhere. Kubernetes is not confined to the public cloud distributions of EKS, GKE and AKS – there are dozens of distributions. MinIO, for example, works in the public cloud, private cloud and the edge. The building blocks of the cloud run anywhere.

    Developers know this. It is why they have become the engine of value creation in the enterprise. They know the cloud is about engineering principles, things like containerization, orchestration, microservices, software-defined everything, RESTful APIs and automation.  

    Understanding these principles and understanding that they operate just as effectively outside of the public cloud creates true optionality and freedom. There is no “one” answer here – but with the cloud operating model as the guide, enterprises create optionality. Optionality is good.

    M.R.: How has the cloud lifecycle changed and is repatriation the answer?

    AB: Early cloud native adopters quickly learned principles of the cloud. Over time, workloads grew and costs ballooned. The workloads and principles were no longer novel – but the cost to support the workloads at scale was.

    For enterprises, it has become clear that the value has been inverted by the costs of remaining on the cloud. This is the lifecycle of the cloud. You extract the agility, elasticity, and flexibility value, then you turn your attention to economics and operational acuity.

    Repatriation is but one tool. There are many. It is really about optimization. What you are optimizing for should help determine where you should run your workload. At MinIO, we are agnostic; you can find us in every cloud marketplace (AWS, Azure, GCP, IBM). You can find us on every Kubernetes distribution (EKS, GKE, AKS, OpenShift, Tanzu, Rafay). That is the definition of multi-cloud.

    We talk about balancing needs and optimizing for workloads. Again, some workloads are born in the public cloud. Some workloads grow out of it. Others are just better on the private cloud. It will depend. 

    What matters is that when your organization is committed to the principles of the cloud operating model you have the flexibility to decide and with that comes leverage. And who doesn’t like a little leverage – especially in today’s economy.

    M.R. Rangaswami is the Co-Founder of Sandhill.com


    SEG’s 2023 Annual SaaS Report

    By Article

    As we round the corner on the first quarter of 2023, we thought it would be an appropriate time to check in and review Software Equity Group’s Annual Report.

    According to SEG’s report, SaaS continues to be an attractive asset class for private equity and strategic buyers. M&A deal volume in 2022 surpassed 2,000 transactions for the first time, a 21% increase over 2021.

    Private equity buyers with record amounts of dry powder drove volume and valuations, comprising nearly 60% of SaaS M&A deals, a record for annual activity, and accounted for some of the highest multiples in 2022.

    Public market indices across the board struggled to overcome the tumultuous macroeconomic landscape of 2022. While multiples continued to decline from the unsustainable run-up in 2021 (14.7x), public SaaS companies in the SEG SaaS Index demonstrated operational resiliency. The median EV/Revenue multiple sat 15% higher than 2018’s pre-pandemic levels, which were considered healthy at the time. What’s more, recent indicators show inflation moderating and the potential easing of interest rate hikes, which should bode well for SaaS multiples going forward.

    Here are 5 summary points to note:

    1. Private equity capital overhang and fierce strategic competition catalyzed SaaS M&A activity and buoyed EV/Revenue multiples in 2022, despite broader macroeconomic turbulence.
    2. SaaS M&A deal volume remains near peak levels, reaching 2,157 deals in 2022 and growing 21% over 2021.
    3. The median EV/Revenue multiple for SaaS deals jumped to 5.6x in 4Q22, surpassing the median SEG SaaS Index public market multiple of 5.4x. Buyers and investors paying a premium for high-quality assets bolstered valuation multiples for SaaS M&A in 2022.
    4. Private equity-driven deals accounted for the highest percentage of transactions to date on an annual basis (59.5%) due to the record amount of capital raised demanding deployment to worthy assets.
    5. Noteworthy deals include Adobe’s acquisition of Figma ($20B), Vista Equity’s acquisition of Citrix ($16.5B), and ICE’s acquisition of Black Knight ($16B).

    Click here to view SEG’s full 2023 SaaS Report.

    M.R. Rangaswami is the Co-Founder of Sandhill.com


    Quick Answers to Quick Questions: Dominic Lombardi, Vice President of Security & Trust, Kandji

    By Article

    Sharing his list of what organisations must pay attention to when it comes to their security, Kandji’s VP of Security and Trust, Dom Lombardi, details how organizations can stay one step ahead of this year’s risks, threats and potential attacks.

    M.R. Rangaswami: With the higher risk of infrastructure attacks, what will be the biggest thing to stay ahead of to avoid a concerted effort of attacks against organizations?

    Dom Lombardi: Attackers will continue to become more creative in their pursuits. It has been reported that about 25% of all data breaches involve phishing and 82% of breaches involve a human element. Many of the security controls we put in place earlier are at risk of being bypassed due to human error. Financially motivated cybercriminals will concentrate on corporate entities, where they will try to derive personal identifiable information (PII) or customer payment card information.

    Further, “strategic viability” attacks against critical infrastructure systems will continue to increase. Think oil pipelines, power generation, rail systems, electricity production, or industrial manufacturing. There is still the possibility that key government or corporate services could be targeted — something tied to global tensions.

    M.R.: Why is it important for companies to prioritize Zero Trust in their cybersecurity plans?


    Dom: Security teams have been talking about the zero-trust cybersecurity approach for a few years. It used to be “trust, but verify.” The new zero trust — in a workplace filled with multiple teams, multiple devices, and multiple locations — is “check, check again, then trust in order to verify.”

    Organizations continue to play a cat-and-mouse game with hackers, attackers, and bad actors. Only 6% of enterprise organizations have fully implemented zero trust, according to a 2022 Forrester Research study.

    The complex and disparate workplace environments that are so common now make it difficult to adopt zero trust — at least all at once. If you are using AWS, Azure, and GCP alongside an on-premises instance and a private cloud running virtualization through VMware, it will take some time to roll everything out uniformly.

    As we all continue to embark on the zero trust journey, we will see new solutions for complex problems companies are experiencing on premise and in public and private clouds. By mastering basic IT (and security) hygiene, updating and communicating your risk register (a manual that outlines current and potential security risks and how they could impact the organization), and working steadily toward a zero-trust security model, you’ll be one step ahead of most other organizations — and hopefully two steps ahead of the hackers!


    M.R.: As companies continue to build their security plans, how will the role of the CISO expand at organizations?

    Dom: The CISO can continuously champion the risk register to ensure they receive the resources needed to remediate and reduce risk on an ongoing basis. Keep in mind that new threats, risks, and updates will always populate your risk register. It is critical to actively work through this list to prevent risks from escalating and becoming even more complicated.

    Additionally, to prevent miscommunication and promote total transparency, any CISO who does not report directly to the CEO should demand that they do — immediately. Organizations need to take a risk-conscious approach to developing their security program and risk mitigation strategies.

    A CISO must report to the CEO to ensure direct lines of communication regarding risk scenarios and potential loss events. CEOs are ultimately accountable for the course of action they set the organization on, and CISOs provide the CEO with the direction and guidance to make informed, risk-conscious decisions.

    To set themselves up for success, CISOs should ensure that the general counsel at their organization is in their “peer set.” This relationship with your general counsel is integral to a unified approach to legal and security risk mitigation. The organization’s general counsel and CISO share a common goal: to keep the company, their customers, and the organization’s leaders safe.

    M.R. Rangaswami is the Co-Founder of Sandhill.com


    M.R. Asks 3 Questions: Idit Levine, Founder & CEO, Solo.io.

    By Article

    Idit Levine is the founder and CEO of Solo.io, the unicorn service mesh platform company.

    Some say that it was her professional basketball career that gave her the drive to develop such a successful enterprise. Others report her success should be attributed to her ability to self-teach solution-based skills. Likely, it’s both.

    Idit’s dialled-in focus on how enterprises can connect hard-to-change, tested applications with more modern, flexible microservices and service mesh is worthy of her organisation’s $1 billion valuation.

    M.R. Rangaswami: Modern enterprise infrastructure and next gen technologies such as modern cloud native and microservices have basically eliminated the perimeter, so how has this affected security as a whole?

    Idit Levine: Not so long ago, a perimeter separated a company’s assets from the outside world. Now, there is no “inside” versus “outside”; everything is considered outside. A larger attack surface—the number of exposed and potentially vulnerable resources within your enterprise—means more opportunities for cybercriminals. And the average cost of a data breach in the U.S.? A staggering $9.44 million. Forward-looking organizations have implemented defense-in-depth (DiD), a multi-layered cybersecurity approach with several defensive mechanisms set up to protect valuable data and information. Others are implementing zero-trust, which basically means check, check again, then trust in order to verify.

    One of a modern organization’s biggest challenges is assessing exactly how many entities they must secure. Keep in mind that microservices and modern applications have exponentially more pieces than previous generations of applications. One microservice may contain 10 pieces while a previous application had only one. Once you break down these multi-part applications and services, you must factor in how all these pieces communicate over the network—a network that should be inherently untrusted.

    M.R.: Service mesh has long been thought of more as a DevOps solution, but can it too help with modern security?

    Idit: Service mesh tackles the prime challenges of developing and securing microservices and modern applications (different teams using different languages and frameworks) by moving authentication and authorization to a common infrastructure layer. The service mesh helps authenticate between services to ensure secure traffic flow, also enforcing service-to-service and end-user-to-service authorization. Service mesh enforces role-based access control (RBAC) and attribute-based access control (ABAC). A service mesh can validate the identity of a microservice as well as the resource (server) running the microservice.

    A service mesh also acts as traffic control within the network, freeing application teams to focus on building applications that benefit the business—without taking on the additional task of securing these applications. The service mesh delivers consistent security policies for inside and outside traffic and flexible authentication of users and machines. It also enables cryptographically trusted authentication for both users (humans) and machines or applications. Cryptographic security depends on keys to encrypt and decrypt data to verify and validate users. In addition to enabling encrypted paths between applications, service mesh allows for flexible failover (and improved uptime) and known points for security logging and monitoring.
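    A minimal sketch of the service-to-service authorization idea described above (the identities and policy table are invented, and this is not any particular mesh's API): the mesh layer checks a cryptographically verified workload identity against a default-deny policy before letting a call through.

```python
# Default-deny policy: a caller identity maps to the set of services it may call.
# SPIFFE-style identity strings are used here purely for illustration.
POLICY = {
    "spiffe://cluster/ns/checkout": {"payments", "inventory"},
    "spiffe://cluster/ns/frontend": {"checkout"},
}

def authorize(caller_identity: str, target_service: str) -> bool:
    """RBAC-style check enforced at the mesh layer: unknown callers and
    unlisted targets are denied."""
    return target_service in POLICY.get(caller_identity, set())

print(authorize("spiffe://cluster/ns/checkout", "payments"))  # True
print(authorize("spiffe://cluster/ns/frontend", "payments"))  # False
```

    The key design point is that this check lives in shared infrastructure, so application teams get consistent enforcement without writing authorization code themselves.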

    M.R.: Does zero trust have a play here? How should InfoSec treat a zero trust strategy?

    Idit: It’s been a year since President Joe Biden issued a cybersecurity executive order spelling out the importance of adopting a zero-trust cybersecurity approach, yet only 21% of critical infrastructure organizations have adopted a zero-trust model.

    The zero-trust approach is essential for fast-moving, cloud-native application environments. Many commercial organizations and government agencies are turning to service mesh to bolster their zero-trust initiatives. Government agencies, for example, always struggle to secure high-value assets (including critical infrastructure) from hackers and bad actors. And these attackers can be internal (disgruntled employees or contractor/vendor breaches) or external (foreign nation-state threat actors). As a result, there are no insiders or outsiders; everyone is outside and untrusted until proven otherwise.

    Service mesh is one of the simplest ways to enable zero-trust security. A service mesh helps authenticate and cryptographically validate and authorize people, devices and personas. It can further be used to enforce policies and identify potential threats.

    M.R. Rangaswami is the Co-Founder of Sandhill.com


    M.R. Asks 3 Questions: Krishna Raj Raja, Founder and CEO of Supportlogic.io

    By Article

    Today the customer support experience is critical to revenue at every phase of the customer journey, from pre-sales through renewal and expanding customer relationships. Krishna Raj Raja founded SupportLogic in 2016 to help transform the role of customer support, bringing deep experience in the service and support industry. 

    As the first hire for VMware India, Krishna built the company’s support organization into a multi-thousand headcount global organization. Now at the helm of SupportLogic, he and the company help some of the largest B2B technology companies in the world to optimize their support experience.

    M.R. Rangaswami: What are some of the trends driving the need for companies to focus on their support experience?

    Krishna Raj Raja: There are several key trends that are accelerating the need for every company to invest in the customer support experience:

    1. Velocity of technology adoption. It took 80 years for the invention of the telephone to reach 100 million users. Mobile phones took only 16 years and Whatsapp took less than four years. ChatGPT took only two months to reach that milestone. Not only is the rate of adoption faster, but we are also updating the technology at an increasingly faster rate. Both these trends stress vendors as it’s far more challenging to handle growing support issues without compromising the brand experience.
    2. Focus has shifted to post-land. The rise of the “subscription economy” and SaaS put the spotlight on customer retention. Today more and more companies are transitioning to usage-based-monetization models. The focus has now shifted from landing customers to driving product adoption post-land. Support plays a crucial role in this transformation. Customers are more likely to adopt a product that is backed by a world-class support experience.
    3. Product-led growth models for enterprise. This is part of a continued trend of “consumerization of the enterprise,” which vendors may falsely assume means it’s easy to design a perfect product that does not need marketing, sales and support to be successful. The opposite is true, in fact, and while PLG-native companies may have an easier time, many companies transitioning from a traditional sales-led GTM motion require even more investment in the support experience to evolve successfully.
    4. Big Data vs. Thick Data. Big Data’s focus historically has been on metadata and machine data. This is the first time in the industry we can process unstructured data at scale. The ability to process customer sentiment and unlock the Voice of the Customer from support interactions has led to the rise of thick data. Emerging business trends can now be spotted in thick data that were previously untapped in big data analysis.

    M.R. Rangaswami: AI has jumped from being in the hype cycle to being a more mainstream technology. What role does it play in support experience?

    Krishna Raj Raja: ChatGPT has recently gained much media attention, and AI technologies in general have greatly accelerated to serve more real-world applications, including the support experience. Companies are using AI and Natural Language Processing (NLP) to mine and organize raw customer sentiment signals like “frustration,” “needs attention,” and “looking at alternative solutions,” and turn them into predictive scores such as “likely to escalate or churn” and guided workflow steps that help support managers and agents coach, assign cases to the right agent, and feed a more intelligent product feedback loop.
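    To make the signal-to-score idea concrete, here is a hypothetical sketch (the signal names and weights are invented for illustration, not SupportLogic's actual model) of rolling detected sentiment signals up into a single escalation score a manager can triage by:

```python
# Invented weights for detected sentiment signals.
SIGNAL_WEIGHTS = {
    "frustration": 0.5,
    "needs_attention": 0.3,
    "considering_alternatives": 0.7,
}

def escalation_score(detected_signals):
    """Sum the weights of the signals detected in a case, capped at 1.0,
    yielding one 'likely to escalate' number per support case."""
    return min(1.0, sum(SIGNAL_WEIGHTS.get(s, 0.0) for s in detected_signals))

print(escalation_score(["frustration", "considering_alternatives"]))  # 1.0
print(escalation_score(["needs_attention"]))                          # 0.3
```

    In a real system the signals would come from NLP models over support interactions rather than hand-labeled lists, but the scoring step looks broadly like this.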

    The use of AI enables new levels of speed and precision to take the right steps to improve the customer experience at a scale of millions of customer interactions. 

    M.R.: How do companies make the business case for solutions like SupportLogic during an economic downturn, where all costs are significantly scrutinized?

    Krishna: In light of the current economic headwinds, every purchase is under a microscope and the business case must be rock solid. A few factors that are helping to move technology purchases forward:

    1. An ability to consolidate and reduce other technology spend – i.e. a typical company may be spending money on hundreds or thousands of SaaS applications that get marginal use. If you can demonstrate that you perform the bulk of use cases and the same value as a bunch of them, it’s an easier internal sell to finance leadership.
    2. Showing clear financial metrics and speaking the language of finance – e.g. calculations on how you help with Net-Dollar Retention, Margins, Customer Lifetime Value and the “Rule of 40” go a long way in getting support from finance and business decision makers.
    3. Demonstrating benefits across multiple functions/departments in the organization vs. being narrowly focused on one role or function.

    The good news is that investing in Support Experience with solutions like SupportLogic addresses all of these areas, making it a top investment priority for organizations that may be cutting back in other areas. We have content that walks through how to make the business case in more detail. 

    M.R. Rangaswami is the Co-Founder of Sandhill.com


    M.R. Asks 3 Questions: Shub Bhowmick, Co-founder & CEO, Tredence Inc.

    By Article

    With over two decades of experience advising on tech strategy, M&A integration and operations improvement, Shub Bhowmick’s career has thrived with building and running high impact projects in a wide range of industries.

    According to Forbes, Shub’s strength lies in his ability to break down complex problems, identify risks, assess business value and then provide recommendations on remediation/value attainment. All of this stems from his MBA at Northwestern University’s Kellogg School of Management and a Bachelor of Technology with honors in Chemical Engineering from IIT-BHU in India.

    M.R. Rangaswami: Everyone’s betting on analytics and AI, so how should a company evaluate an AI vendor?

    Shub Bhowmick: At a recent event, Reid Hoffman said, “You are sacrificing the future if you opt-out of AI completely.” The AI and data science industry continues to evolve at light speed, and this year will be no different. However, enterprises are adjusting their expectations as cost reduction and shareholder value realization are fast becoming a central theme.

    In light of the increasing importance of AI in business today, companies worldwide are justifiably spending more time and effort evaluating AI consultants. Data science solutions are more valued than ever before because they help companies differentiate themselves from the competition and spark organic growth.

    Identifying the right AI partner or solution can be challenging since everyone claims to be able to solve every problem, every time. First, it is important to know what problems your business is trying to solve; don’t go into this evaluation blindly—ensure that you have a clear list of what you need and what business goals you’re aiming to accomplish. Then, you need to take a closer look at your options: What problems are the various AI vendors solving (and how effective is their work)? What industries do they have experience with? Are they growing and innovating or standing still? Do they have a regional or global presence? Can they support a broad range of users?

    Ultimately, doing things at the edge is what the future is about. A combinatorial focus on innovation, customer-centricity, business value realization and custom solutions will help you find the best AI vendor for your organization.

    M.R.: What are the most effective ways for companies to use AI and ML to reduce costs and maximize profitability?

    Shub: AI and machine learning technology have quickly become integral parts of digital transformation strategies for businesses, as these solutions are essential for improved efficiency, cost-cutting and maximizing profits. AI has the potential to integrate everything within an enterprise from customer insights to hyper-personalization, order generation, warehouse inventory optimization, the right routing optimization, delivery, products shown on the catalog, POS data and finally to pricing. To illustrate their immense capability and potential further, let’s look at some real-world use cases. 

    For instance, a customer intelligence platform like COSMOS helps retailers get 360-degree visibility into the customer, both when they are with you and with the competition. The platform delivers real-time access to customer insights with seamlessly integrated first- and third-party data to run multiple experiments and perform holistic measurements.

    Similarly, the role of AI in CPG and manufacturing is significant, where a solution like supply chain control tower future-proofs supply chain with prescriptive insights and helps companies handle future disruptions and opportunities, with centralized control. 

    When used in collaboration, AI and ML can predict what products and services will be in greater demand so that businesses can maximize sales and growth opportunities while engaging fewer resources. AI and ML are designed to help companies decrease costs while growing profitability. This is just one of the many reasons more businesses are turning to the latest data science solutions.

    M.R.: What is the last-mile problem in AI and how can it be solved?

    Shub: The last-mile problem in AI is the critical gap between insight creation and value realization—it has long been one of the most challenging issues for organizations across various industries and continues to test companies today. While generating insights is certainly worthwhile, if you can’t use them to change behavior or move the dial, then that gap is both costly and unproductive for companies. 

    Tredence ensures insights are actionable and impactful so our clients can grow revenue, remove barriers to innovation and uncover new opportunities to create meaningful and sustainable value. Working with several Fortune 100 CDOs, we help enterprises understand the economic value of data and the importance of leading a data-driven organization. With all that in mind, our goal is to be on every CDO's speed dial in the next 2-3 years. We excel at solving the last-mile problem and helping organizations create true value; with Tredence, you can solve both vertical (industry-specific) and horizontal (cross-industry) issues.

    M.R. Rangaswami is the Co-Founder of Sandhill.com


    M.R. Asks 3 Questions: Josh Lee, Co-founder & CEO, Swit

    By Article

    Josh Lee is the Co-Founder and CEO of Swit, a Project Work Management platform. Swit was selected as the winner of the Startup Grind Global Conference 2020 from more than 4,000 applicants, and in 2022 it was ranked the No. 1 project management software in the G2 marketplace Usability index among 140 competitors.

    Swit was recognized by CIO Review as a Top 4 Remote Work Tech Solution alongside Slack, Asana, and Monday in 2020, and was officially recommended as an Editor's Choice on the Google Workspace Marketplace in 2021.

    M.R.: Why is the collaboration space saturated, and how does Swit differentiate its offering?

    Josh: Every team has different workflows and its own tooling preferences, so departments within the same organization often choose different tools from the same software category. For example, IT teams want Jira for project work management, while non-IT teams like HR, Marketing, or Sales prefer Asana or Monday for the same purpose.

    This freedom of tool choice ends up creating departmental boundaries, making it hard for multiple teams to collaborate on shared goals in a project. The more projects someone is involved with, the more silos they suffer from, juggling too many point solutions while losing context. In this fragmented work environment, checking task dependencies with other teams across multiple projects feels unattainable. Companies are struggling against disconnected systems; we're now in a crisis of tool overload. Streamlining workflows across teams is impossible without stitching these disconnected systems back together.

    So we built Swit to provide a collaboration framework for cross-team projects, offering just the right amount of every common denominator of collaboration essentials, from chat to tasks, in one place. It's designed for employee connection across departments, so organizations can create a more connected employee experience in company-wide, cross-functional projects.


    M.R.: How have strategies for digital transformation changed before and after the pandemic?

    Josh: Throughout the pandemic, it's become much harder to connect as teams. People feel more disconnected from each other, working in different places and time zones. There are too many digital tools, and notification-based chat alone has failed to serve as a distraction-free hub for third-party app integrations. Digital fatigue is now at an all-time high, leading directly to distrust, disengagement, inefficiency, and low productivity. Work synchronization should not depend so heavily on video-call-based sync meetings; there's just no question that we are digitally drained. In addition, new generations are looking for a unified work hub that enables efficient asynchronous communication and transparent, trackable collaboration to bring a more human sense of belonging to remote work.

    The world is changing and all previous Digital Transformation strategies will not work in this new world. We need a digital twin for the company completely redesigned from the ground up as a true-to-life space that connects our work across systems and brings people back closer together. 

    Companies will not succeed in digital and cultural transformation by focusing on employee management; they will succeed only by focusing on employee connection. Standing still in the comfort zone built pre-pandemic is not an option for survival, let alone for thriving. Swit was born to connect people and work across departments and systems so that even large organizations can drive that connection beyond barriers and evolve their employee experience strategy more sustainably.


    M.R.: How do you adjust Swit’s GTM strategy during this economic downturn?

    Josh: We understand that one good product is not enough for scalability, because one size does NOT fit all. This market is already hugely saturated with too many single-function point solutions. So we also offer a SaaS Integration Platform, so our clients can configure and customize the product to their needs, create user-defined bots, build and publish third-party app integrations, and automate all the necessary functions by themselves.

    Salesforce has said that the future of SaaS is SIP, and we've just brought that future to now. This configurable product and customizable platform offering is optimized to help our users stay connected across teams and across tools. Internally, we call this PPLG: Product & Platform-Led Growth. We built the "product" to be industry-neutral, covering the common denominators of every team's daily workflows, while the "platform" empowers our clients to meet their industry-specific needs themselves.

    Fortunately, Swit is recession-proof because it's essential software that companies use consistently regardless of market fluctuations. In fact, Collaborative Work Management has been the fastest-growing category in the endemic era.

    Even though we've offered one work hub that consolidates chat and tasks in one place for the four years since launch, we'll also release single-function tiers in July 2023 with much more affordable pricing plans, and add 11 languages and their local currencies to target global markets.

    M.R. Rangaswami is the Co-Founder of Sandhill.com


    M.R. Asks 3 Questions: Co-Founder & CEO of Safe Security, Saket Modi

    By Article

    Saket Modi is the Co-founder and CEO of Safe Security, a Cyber Risk Quantification and Management (CRQM) Platform company. A computer science engineer by education, he founded Safe Security in 2012 while in his final year of engineering, along with Rahul Tyagi and Vidit Baxi. Saket enjoys trying global cuisines, photography, and surprising his friends by playing the grand piano.

    M.R. Rangaswami: What is it really like to work with John Chambers (an investor in your company) – what is the single most valuable advice he has given you?

    Saket Modi: It’s incredible to work with John. He’s someone who has seen the economy and businesses move and reshuffle not once but multiple times. The most valuable advice he’s given us at Safe is to focus on customers. It reflects in our core value of keeping the customer first, always.

    M.R.: What trends do you see in the Cyber insurance market: Who is buying, what about rates – say something the readers can take action on?

    Saket: In Safe's 2023 cyber insurance market outlook, we observe premium rates stabilizing. Insurance carriers are adapting to new ways of underwriting cyber risk amid an evolving threat landscape, compounded by improved cybersecurity practices and investment among end-insureds.

    Carriers have raised the bar for entry for cyber insurance, increasing the information security requirements for organizations to qualify to obtain coverage. Coming out of a hard market, we are now seeing more competition, with more carriers open to underwriting cyber insurance again.

    2023 is the year the cyber insurance industry will introduce "inside-out" underwriting. Carriers will leverage continuous, real-time, and precise cyber risk insights to effectively link the cyber insurance policy to the insured's cybersecurity posture. After two-plus years of significant premium increases amid reductions in coverage, insureds who have been investing in cybersecurity want to be acknowledged and rewarded by their cyber insurance partners, and they are more willing than ever to share "inside-out" cyber risk telemetry in a non-intrusive way.


    M.R.: What are the top cyber risks you see in your customer base that are simple to mitigate for enterprises – with the highest ROI?

    Saket: I don't think there is a simple answer here. While customers worry most about ransomware and data breaches, they increasingly want to model different possible risk scenarios dynamically. It is no longer about which risk is hypothetically the most probable.

    Customers want to understand the reality of their cyber risk posture and act accordingly. Organizations we have interacted with understand that risk is subjective – varying with the industry, geography, and annual revenue. Security and risk leaders want to understand how their company is positioned in the present and compare their cybersecurity status with future cyber risk scenarios. That’s where Cyber Risk Quantification and Management (CRQM) solutions, such as the Safe Platform, help them.

    SAFE allows them to build custom risk scenarios for their environment, enabling them to demonstrate and measure the likelihood of their organization being breached, the financial impact of possible breach scenarios, and a prioritized list of actions to improve security posture and reduce risk in a manner that maximizes return on security investment (ROSI).
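
    To illustrate the general shape of a cyber risk quantification (hypothetical probabilities and dollar figures, not Safe's actual model), here is a minimal Monte Carlo sketch that turns an annual breach likelihood and a loss range into an expected annual loss, before and after a control improvement:

```python
# Minimal Monte Carlo sketch of cyber risk quantification. The breach
# probabilities and loss ranges below are invented for illustration.
import random

random.seed(42)  # reproducible illustration

def simulate_annual_loss(p_breach, loss_low, loss_high, trials=100_000):
    """Expected annual loss from simulated years of breach/no-breach."""
    total = 0.0
    for _ in range(trials):
        if random.random() < p_breach:                    # breach this year?
            total += random.uniform(loss_low, loss_high)  # how costly?
    return total / trials

# Scenario A: current posture, 20% annual breach odds.
base = simulate_annual_loss(p_breach=0.20, loss_low=1e6, loss_high=5e6)
# Scenario B: after a control improvement, 8% annual breach odds.
improved = simulate_annual_loss(p_breach=0.08, loss_low=1e6, loss_high=5e6)

print(f"expected annual loss: ${base:,.0f} -> ${improved:,.0f}")
```

    Comparing the two expected losses against the cost of the control is one simple way to express the ROSI of a proposed action.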

    M.R. Rangaswami is the Co-Founder of Sandhill.com


    M.R. Asks 3 Questions: Max Liu, Co-Founder & CEO, PingCAP

    By Article

    Prior to founding PingCAP, Max Liu was a software engineer for more than 15 years. He spent many hours designing around and fixing database scaling issues to make coding faster, including creating the Codis open-source project, a distributed cache solution.

    Having first-hand experience with the time-consuming and repetitive processes engineers and developers face, Max developed TiDB, an open-source distributed database. TiDB offers a more streamlined database that can handle scalability and petabytes of data so customers can focus on more important areas like data analysis and business development.

    M.R. Rangaswami: What is HTAP and why is it important to the enterprise?

    Max Liu: Hybrid Transactional and Analytical Processing (HTAP) is a type of database that can process both online transactional and online analytical workloads within the same system, sharing a single source of truth with no data pipeline delay in between. This simplifies technology stacks, removes data silos, and lets companies build actionable insights directly from real-time updates to drive faster growth. HTAP is important to the enterprise because it allows for more efficient and streamlined data processing, which can improve cost efficiency and expedite business operations and decision making.
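
    As a toy illustration of that single-source-of-truth idea (sqlite3 stands in here purely for demonstration; it is not an HTAP engine), one store can absorb transactional writes and immediately answer an analytical aggregate over the same rows, with no ETL step in between:

```python
# Toy illustration of the HTAP idea: one system, one copy of the data,
# serving both a transactional write and an analytical read. A real HTAP
# database such as TiDB runs row-store OLTP and columnar OLAP engines
# behind one SQL interface; sqlite3 is used here only to show the shape.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, region TEXT, amount REAL)")

# OLTP side: individual orders inserted as they happen.
db.executemany("INSERT INTO orders (region, amount) VALUES (?, ?)",
               [("east", 120.0), ("west", 80.0), ("east", 45.5)])
db.commit()

# OLAP side: an aggregate over the same, just-written rows --
# no export, no overnight batch, no stale copy.
rows = db.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('east', 165.5), ('west', 80.0)]
```

    The HTAP value proposition is that this immediacy holds at petabyte scale, across both workload types, without maintaining two separate systems.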

    M.R.: What are the biggest challenges for database and analytics management today and how should they be addressed?

    Max: The biggest challenges for database and analytics management today can be summarized in three Vs: volume, variety, and velocity. These translate into the need to process a growing volume and variety of data, the need for real-time data processing and analysis, and the need to integrate data from multiple sources and systems. These challenges can be addressed through advanced technologies such as in-memory databases, distributed databases, and cloud-based analytics platforms, respectively. In order to satisfy all three needs at once, however, a very complex data architecture has become the new norm.



    As a side effect, the challenge extends to balancing data capability against evolving velocity. Additionally, the lack of data engineering talent for such a complex architecture is a high barrier for most organizations: adopting a data-driven culture requires investing in skilled personnel and implementing effective data governance and security practices.

    M.R.: Where do you see the database market evolving in the next 5 years?

    Max: In the next 5 years, I expect the database market to continue to grow and evolve, with a focus on cloud-based solutions, the integration of artificial intelligence and machine learning technologies, and the development of distributed and scalable databases to support the growing volume and complexity of data. Simplified data architecture is likely to play a key role in this evolution, as it can help to reduce complexity and improve data accessibility, enabling organizations to gain greater insights from their data and make more informed business decisions.

    Additionally, there may be increased emphasis on data security and privacy, as well as the integration of databases with other technologies, which again benefits from a single-database architecture. Overall, I expect the database market to grow even faster with recent AI technology boosts, like GPT-3.5 or the coming GPT-4. And the beauty of simplicity, for instance HTAP databases and low/no-code tools, will become more powerful.

    M.R. Rangaswami is the Co-Founder of Sandhill.com


    M.R. Asks 3 Questions: Peter Brereton, CEO, Tecsys

    By Article

    Peter Brereton is president and CEO of Tecsys, a global supply chain management software company. He joined Tecsys at its inception, initially leading the company's software development, product management, sales and marketing, and has now served as president and CEO for over 20 years.

    Having been recognised with an EY Entrepreneur Of The Year® award in Quebec in 2019, Peter leads with a strong moral compass rooted in family and faith. He has guided the company through tremendous growth, not only with a sharp vision for the supply chain industry, but by adhering to his fundamental values of authenticity and honesty.

    With 20+ years at Tecsys, Peter has a lot to share about growth, leadership and the future of tech in the healthcare space.


    M.R. Rangaswami: Let's break it down: who is Tecsys, and what do you do?

    Peter Brereton: Coming up on our 40th anniversary, Tecsys sells and implements SaaS ERP and WMS solutions that manage agile supply chains. We're currently at about a $150M run rate, with SaaS revenues growing at about 40% per year, and we're profitable!

    Tecsys' end-to-end supply chain software platform includes solutions for warehouse management, distribution and transportation management, supply management at point of use, order management and fulfillment, as well as financial management and analytics.

    These solutions are designed to accommodate the needs of several industry segments; our customers include organisations spanning third-party and complex distribution, converging retail, healthcare supply chain, as well as government bodies and agencies.

    For decades, organisations have been adding length and complexity to their supply chains without paying attention to the vulnerabilities that those complexities create. Layer in digital commerce, globalisation, new consumer expectations and aging systems, and what worked in the past is likely less relevant now. In today's rapidly changing world, an agile supply chain platform that can efficiently manage change is crucial to remaining competitive.

    That’s where we deliver our greatest value. Through our software solutions, we empower companies to run a modern supply chain practice with end-to-end visibility and the digital tools to adapt to change.

    M.R.: I hear that Tecsys solutions are truly transforming some aspects of healthcare. Can you explain?


    Peter: Over the last 10 years, Tecsys has proven that an efficient real-time digital supply chain platform improves cost, quality and outcomes for hospital networks. Tecsys is the established leader in this market, with more than 50 substantial hospital networks on its platform, and we are currently adding two or three more per quarter.

    Tecsys championed a concept to manage a health system's supplies following an industry best practice framework; it came to be known as the consolidated service center and is widely considered the benchmark for strategic supply chain management at a health system level. With dozens of major health system implementations under our belt, we continue to lead the industry in transforming traditional healthcare supply chain operations into modern clinically integrated supply chain networks.

    Another important facet of healthcare supply chain management is facilitating collaboration between clinical and logistics teams to provide the best possible outcomes for patients. Because this is such an important part of the care delivery chain, we are highly focused on the deployment of clinically integrated point-of-use technology that connects clinical operations to the back-office supply chain activities needed to support patient care.

    At this point, we are the only vendor in the market that can tie the OR, cath lab, general supplies and pharmacy together in a truly integrated supply chain, along with the off-campus warehouse or consolidated service center. It turns out that having the right product at the right time for the right patient, in the hands of the right clinician, saves lives and millions of dollars!


    M.R.: What do the next three years look like for Tecsys in the healthcare space?

    Peter: The healthcare industry is under pressure from both a clinical and operational perspective. With labor challenges and rising supply costs continuing to squeeze margin, this sector is facing a formidable challenge. The pandemic deepened and accelerated those challenges, exposing vulnerabilities and forcing transformation on healthcare organisations that were slower to adapt.

    Supply chain transparency and traceability will continue to drive investment in the healthcare sector. Health systems will keep evolving and growing, which means higher supply chain complexity and increased challenges.

    The behemoth enterprise systems that worked well at the turn of the millennium are really showing their limitations now, and the urgency to modernise is just ramping up. There are 550 hospital networks in the U.S., and Tecsys is pursuing the top 300. Tecsys fully expects to have over 100 hospital networks as clients within the next three years, on our way to more than 50% market share.

    M.R. Rangaswami is the Co-Founder of Sandhill.com


    M.R. Asks 3 Questions: Matt Gorniak, CEO, Threekit Visual Commerce

    By Article

    There's a new age of B2B growth, and it's all about the product experience. Matt Gorniak, CEO of Threekit Visual Commerce, is at the forefront of this era.

    Matt's expertise comes from co-founding G2 and cloud pioneers such as BigMachines (acquired by Oracle) and SteelBrick (acquired by Salesforce, now Salesforce CPQ), and now leading Threekit.

    Here is what Matt and his teams are seeing as they usher in a new age of B2B growth. 

    M.R. Rangaswami: Why is it the new age of B2B Growth?

    Matt Gorniak: The best way to view the world of B2B Commerce is in 3 stages.

    First, it was a world of spreadsheets and lots of manual processes. 

    Second, where most B2B companies are today: a world built around making it easier for the seller to sell. Tools like CRM, CPQ, and ERP make sales processes faster and more efficient for the seller.

    Third is the new age of B2B, where new kings will be crowned. Today it's not just about making it easier for sellers to sell; what's changed is that now it's about making it easy for buyers to buy.

    B2B winners will make it easy for customers to buy on their terms. They will show more of their product and deliver amazing, seamless, and efficient product experiences that keep their customers coming back.

    M.R.: You mention moving from the age of “Seller to Buyer” – why is that important?

    Matt: It has been said, but it bears repeating: everyone, and I mean everyone, wants an easier buying experience. B2B buyers really do want self-service as much as possible, whether they are buying bulk gift cards or a forklift.

    To complete a sale today, most B2B companies need a salesperson to collect criteria, create a quote, send samples, do renderings and more.

    To compete and win in the future, B2B companies will need a tool that allows buyers to configure, price, and visualize a product in real time.

    Buyers want to literally see the product, configure it, and be served all the relevant pricing, quoting, and delivery information. And they want it easily accessible, 24/7, with all of the product and customer rules baked in.

    M.R.: How does Threekit Visual Commerce help B2B brands level up to the new age of B2B growth?

    Matt: Threekit creates a magical product experience for your buyers: let them visually configure your product with a platform fully integrated with your tech stack.

    It works by taking your product catalog and rules and mapping that onto 3D assets. The platform delivers visual configuration in 3D, 2D, and AR so that customers can configure, build, and buy 24/7. 

    Threekit integrates with all of your systems like CPQ, eCommerce, and ERP – so buyers get an accurate price, delivery estimate, and other key information in real time. You can also syndicate the experience to distributors and resellers so they can sell more on your behalf.
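
    A hedged sketch of the catalog-plus-rules idea behind that kind of configurator (the product, options, rules, and prices below are invented, and this is not Threekit's API):

```python
# Illustrative product configurator: options, validity rules, and live
# pricing in one structure. All names and prices are hypothetical; a real
# platform would map these options onto 3D assets and integrate with CPQ/ERP.

BASE_PRICE = {"standard": 900.0, "heavy_duty": 1400.0}
OPTION_PRICE = {"steel": 0.0, "aluminum": 250.0,
                "paint_red": 40.0, "paint_blue": 40.0}

def validate(config):
    """Product rule baked into the configurator: heavy-duty frames
    cannot be made of aluminum."""
    if config["frame"] == "heavy_duty" and config["material"] == "aluminum":
        raise ValueError("heavy_duty frame requires steel")

def price(config):
    """Return a live quote for a valid configuration."""
    validate(config)
    return (BASE_PRICE[config["frame"]]
            + OPTION_PRICE[config["material"]]
            + OPTION_PRICE[config["paint"]])

quote = price({"frame": "standard", "material": "aluminum", "paint": "paint_red"})
print(quote)  # 1190.0
```

    The point of the sketch is that once rules and prices live in one structure, the buyer can explore configurations self-service, 24/7, and every rendered option comes back with an accurate price.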

    The future of B2B is different – it’s about the buyer. The new age of B2B winners will be the manufacturers that create a product experience which gives the buyer an accurate visual configuration along with all of the necessary information to buy now. 

    M.R. Rangaswami is the Co-Founder of Sandhill.com


    M.R. Asks 3 Questions: Nick Cromydas, Co-Founder & CEO, Hunt Club

    By Article


    Nick Cromydas is the Co-Founder and CEO of Hunt Club, a tech-enabled talent and recruitment company placing leadership roles across the fastest-growing companies in the tech sector. 

    Based in Chicago, Hunt Club has helped over 1,000 high-growth companies land incredible leaders, helping many of them scale their business from seed funding to unicorn status in a matter of years. By leveraging its technology, talent network, and the power of referrals, Hunt Club helps companies find their next great leader faster.   

    Prior to Hunt Club, Nick founded New Coast Ventures, a venture studio that started or invested in more than 50 early-stage startups (four of them unicorns), with material exits to companies like goPuff, Compass, and more.

    M.R. Rangaswami: Is now the right time to hire, given multiple retractions in the SaaS and tech space? 

    Nick Cromydas: The short answer is, yes. 

    Historically, downturns are a great time to a) build a business and b) slow down to re-evaluate your business and focus on getting product-market fit right. We are seeing this shift in the hires companies are pursuing now versus the roles they focused on in previous years, when growth for growth's sake was driving much of the market.

    It’s really a tale of two cities. The media continues to share news around reductions, creating fear in the marketplace, but our team is on the ground helping as tech companies continue to hire. For the most part, growth-stage businesses have plenty of cash. 

    While the period of hypergrowth and high valuations has cooled off, over $643 billion in global venture investment was deployed in 2021, meaning startups are equipped with cash in their reserves. Founders are just being much more thoughtful in how they deploy those dollars. 

    Looking ahead, $290 billion in venture “dry powder” dollars are sitting on the sidelines with $162 billion earmarked for new investments in 2023. Analysts predict that this capital will be deployed next year, reinvigorating tech and startup capital, thus boosting hiring volume needs in the new year. 

    My sense is, as inflation eases and investors start to gain confidence again, we should see a surge in hiring across the tech sector in H1 and H2 of 2023. They might not hire at the same velocity we saw in 2021, largely due to the fact that capital markets won’t be as liquid, but strong balance sheets in Q1 and Q2 2022 are helping to sustain normalcy.

    M.R.: What are the top talent trends you're seeing right now?

    Nick: Companies want to hire, but strategically.

    The volume of businesses with intent to hire has not gone down; they are just being much more thoughtful and intentional about spinning up new roles or growing teams too fast. They are also reprioritizing where to deploy dollars, focusing on roles like engineering, product and operations.

    In 2021 and the first half of 2022, the scales were flipped: it was a candidate's market, and employers had to fight and open extra roles to build benches. With the Great Resignation followed by quiet quitting, it was an all-out war to secure top talent. That combination created a deep deficit of top talent over the past two years, with many key roles left unfilled. Now, with the unfortunate workforce reductions taking place across industries, there is an opportunity for that talent to be absorbed back into the workforce quickly. As a result, we are having very strategic conversations with our customers about which positions they need to fill now versus where they can hold off in order to maximize budgets.

    This is particularly true in the tech sector, where unemployment most recently fell from 2.2% to 2% in November. There is a healthy tension between the number of open roles and the caliber of talent needed to fill those roles. 

    The hybrid work model has also driven the need for dramatic innovation, stumping many founders on how to transform themselves to keep up with changed behavior in a digital-first workforce. This means both search firms and internal talent acquisition need to change, yet there has been very little innovation in the space that has achieved scale since LinkedIn. Without a playbook for building the best teams, Hunt Club has helped growth-stage companies navigate these changes, offering an effective way to reach top talent regardless of their own network or geography.

    Another interesting point is that compensation levels have not materially changed due to layoffs and current market dynamics, and they do not show signs of coming down to pre-pandemic levels. In some cases, salaries are higher than ever; in others, they seem on par with 2021. Geographically, the top four markets (SF/Bay Area, NYC, LA, and Boston) have driven 68% of VC investments so far in 2022. These markets continue to hire across state lines thanks to the remote work flexibility established during the pandemic.

    Demographic changes to the overall workforce are also causing a ripple effect. Aging baby boomers are increasingly retiring from legacy C-level positions. At the same time, a supply deficit of digital-first talent is making it harder for companies to reach and secure the right people to lead. Companies can't afford to get these hires wrong, making innovation and accuracy critical to how they approach talent acquisition.


    M.R.: How are the best leaders handling a looming recession?

    Nick: As we've scaled Hunt Club over the years, I've had the advantage of partnering with and learning from top CEOs and investors who are building the companies of tomorrow while dealing with the challenges of today. The leaders who can tactfully navigate the stages of discomfort and doubt while staying focused on what's most important, without creating unnecessary panic, end up on top.

    When we encounter a downcycle, we're all looking for ways to reduce spending while trying to keep the focus on strengthening product-market fit, a juxtaposition that can feel daunting. But talent is not an area where good leaders skimp. To withstand recession volatility, the smartest companies are focused on making sure they have strong leadership in place to guide them through the storm.

    We are indeed seeing some slowdown across B2C and other sectors, but there are also pockets taking a counterintuitive approach to the macro-market, where hiring is still a top priority. For instance, leadership talent is top of mind for many growth-stage companies and we haven’t seen a drop in those roles. Savvy, forward-thinking founders are actively looking for experienced leaders who have managed through turbulent markets to help sustain and optimize operations. Going back to where we started, early-stage companies recognize that the best time to build a business is often in a downturn. A boomerang market is an opportune time to build foundational teams to drive future growth and scale.  

    M.R. Rangaswami is the Co-Founder of Sandhill.com
