Thank you for stopping by. I began blogging 15 years ago, driven by an interest in technological innovation. Since then, my writing has spanned multiple domains, with a focus on Digital, Data & AI, Cloud Transformation, and selective observations on Indian regional politics. :-) I’ve recently shifted to curating and sharing concise insights and noteworthy links across these areas. I hope you find the content useful and worth your time.
Excerpts from the Gartner Session.
While researching GenAI platforms for business enterprises, I came across the link below, which highlights the Forrester Wave™ evaluation of cognitive search platforms (Algolia, Amazon Web Services, Coveo, Elastic, Glean Technologies, IntraFind, Kore.ai, Lucidworks, Microsoft, Mindbreeze, OpenText, Sinequa, Squirro, and Yext).
It assesses leading providers based on 27 criteria, analyzing their strengths, weaknesses, and strategic direction.
The cognitive search market is undergoing a transformation driven by generative AI, with increasing demand for search-driven applications and the adoption of technologies like large language models (LLMs) and vector databases.
Vendors are evolving beyond basic search functionalities to offer robust indexing, intent-based search, and extensive data connectivity. The report highlights their capabilities in intent understanding, data integration, and advanced retrieval methods.
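To make "advanced retrieval methods" concrete, here is a minimal sketch of vector (embedding-based) retrieval, the pattern behind most LLM-era search platforms. The bag-of-words embedding, the sample documents, and the query are invented for illustration; real platforms use learned LLM embeddings and a dedicated vector database.

```python
# Minimal sketch of vector retrieval: embed documents and a query,
# then rank documents by cosine similarity. Real cognitive search
# platforms would swap the toy bag-of-words embedding for learned
# LLM embeddings stored in a vector database.
import math
from collections import Counter

def embed(text: str, vocab: list[str]) -> list[float]:
    """Toy embedding: term-frequency vector over a fixed vocabulary."""
    counts = Counter(text.lower().split())
    return [float(counts[term]) for term in vocab]

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm if norm else 0.0

documents = [
    "reset your password from the account settings page",
    "quarterly revenue grew in the enterprise segment",
    "how to connect the search index to a data pipeline",
]
vocab = sorted({w for d in documents for w in d.lower().split()})
doc_vectors = [embed(d, vocab) for d in documents]

query = "connect index to data pipeline"
query_vector = embed(query, vocab)

ranked = sorted(zip(documents, doc_vectors),
                key=lambda pair: cosine(query_vector, pair[1]),
                reverse=True)
print(ranked[0][0])  # best match: the data-pipeline document
```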
Leaders excel in intent-based search and LLM integrations, while other vendors focus on specific industry needs or customization capabilities.
As new players enter the market, buyers must prioritize platforms that offer comprehensive indexing, deep intent analysis, and strong data pipelines.
The report provides a comparative analysis, encouraging buyers to customize evaluations based on their requirements using Forrester’s detailed scorecard.
USPs of a few leading Cognitive Search Engine providers.
While exploring Knowledge Graphs, I came across Tony Seale's insightful series on Embracing Complexity - a fascinating read!
"...The Knowledge Graph is not a product that you can buy off the shelf. It is a way of organizing your data and algorithms in a unified network for each organization to bring its genius to this process..."
In a world of increasing complexity, organizations must move beyond traditional tabular data structures and embrace Knowledge Graphs: dynamic, interconnected networks that unify data, cloud, and AI. The articles highlight how graph-based data models unlock hidden insights by capturing relationships, feedback loops, and abstractions that traditional databases overlook.
Key tools like Graph Adapters, Data Services, and Graph Neural Networks enable seamless integration of diverse data sources while enhancing AI-driven decision-making.
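As a rough illustration of the graph-adapter idea, the sketch below lifts two toy tables into a small graph of typed nodes and explicit relationships. The entities, the PLACED_BY relation, and the use of networkx are assumptions made for the example, not a reference to any specific product.

```python
# Sketch: turning tabular rows into a small knowledge graph.
# Node and edge names are illustrative; a real graph adapter would map
# source systems (CRM, ERP, data lake) into a shared graph model.
import networkx as nx

customers = [{"id": "c1", "name": "Acme Corp", "segment": "Enterprise"}]
orders = [{"id": "o1", "customer_id": "c1", "product": "Search Platform"}]

graph = nx.MultiDiGraph()

for row in customers:
    graph.add_node(row["id"], type="Customer", name=row["name"],
                   segment=row["segment"])

for row in orders:
    graph.add_node(row["id"], type="Order", product=row["product"])
    # The implicit foreign key becomes an explicit, queryable edge.
    graph.add_edge(row["id"], row["customer_id"], relation="PLACED_BY")

# Traverse relationships instead of joining tables.
for order, customer in graph.edges():
    print(order, "->", graph.nodes[customer]["name"])
```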
A well-structured Knowledge Graph provides a holistic view of an organization, fostering systemic thinking, better change management, and more informed decision-making. As businesses accelerate into the digital age, building a Knowledge Graph is no longer optional; it is essential for survival and growth.
Graph-shaped data enables richer context, while a graph-shaped cloud ensures seamless connectivity across distributed data sources.
Traditional AI struggles due to its reliance on linear, structured data, but Graph Convolutional Networks (GCNs) offer a breakthrough by incorporating networked relationships into AI learning.
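For intuition, here is a minimal single GCN propagation step in the commonly used normalized-adjacency form, ReLU(Â H W); the tiny graph, random features, and untrained weights are toy values chosen only to show the mechanics.

```python
# One graph-convolution propagation step (Kipf & Welling style):
# each node's new features are a weighted average of its own and its
# neighbours' features, passed through a learned projection.
import numpy as np

rng = np.random.default_rng(0)

A = np.array([[0, 1, 0],          # adjacency: 3 nodes, edges 0-1 and 1-2
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
A_hat = A + np.eye(3)             # add self-loops
D_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt   # symmetric normalization

H = rng.normal(size=(3, 4))       # node features (3 nodes, 4 features)
W = rng.normal(size=(4, 2))       # weight matrix (toy, untrained)

H_next = np.maximum(A_norm @ H @ W, 0.0)   # ReLU(Â H W)
print(H_next.shape)               # (3, 2): new 2-dim embedding per node
```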
By embedding intelligence directly into the data-cloud network, organizations can unlock context-aware AI, enabling smarter predictions, systemic automation, and deeper insights. This network-based approach represents a paradigm shift, making AI more adaptive, intuitive, and lifelike for organizations navigating complex data ecosystems.
It's quite an informative read. Please read through all the parts.
https://medium.com/@Tonyseale/embrace-complexity-conclusion-fb8be6f39deb

The book is a bestselling guide that offers straightforward, practical advice on how to excel in sales and business development.
A "Rainmaker" is a term used to describe the seller who consistently brings in new clients, generates revenue, and drives business growth—an invaluable asset to any organization.
Key Lessons from the book:
The ULTIMATE website on the AI Agent market is here. I wouldn't be surprised if it becomes the go-to source. Think of it as a form of Wikipedia, but for AI Agents!
https://aiagentsdirectory.com/landscape
Source: Andreas Horn
AI agents are more than robots, software, or simple automation. They're autonomous doers that can learn, reason, take action, and make decisions. And currently everyone's talking about them. Beyond the growing buzz, real use cases are already taking off. Whether it is predictive maintenance in manufacturing, customer service, or AI agents disrupting software development, AI agents will be the autonomous specialists of the future.
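One way to picture an "autonomous doer" is the observe-reason-act loop sketched below; the temperature scenario, tool functions, and hard-coded decision rule are placeholders standing in for an LLM-driven planner, not any particular agent framework.

```python
# Minimal sketch of an agent loop: observe state, decide on an action,
# act, and repeat until the goal is met. A real agent would replace the
# hard-coded decide() with an LLM call plus richer tools and memory.
def read_sensor(state):            # "observe": placeholder tool
    return state["temperature"]

def adjust_cooling(state):         # "act": placeholder tool
    state["temperature"] -= 2
    return state

def decide(observation, goal):     # "reason": stand-in for an LLM planner
    return "adjust_cooling" if observation > goal else "stop"

state, goal = {"temperature": 78}, 72
while True:
    observation = read_sensor(state)
    action = decide(observation, goal)
    if action == "stop":
        break
    state = adjust_cooling(state)
    print(f"temperature now {state['temperature']}")
```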
The field is very dynamic, with new approaches, frameworks, and use cases added every week. To keep a good overview, the website below helps a lot. It categorizes most existing AI Agent projects (450+) in the market and breaks them down into the following categories:
➤ Builders: 142 projects
➤ Productivity: 56 projects
➤ Coding: 55 projects
➤ Customer Service: 42 projects
➤ Personal Assistant/Digital Workers: 58 projects
➤ Data Analysis: 28 projects
➤ Workflow: 20 projects
➤ Content Creation: 19 projects
➤ Research: 12 projects
The best part: the website is updated weekly, so it is always up to date! I believe that in the months ahead, AI agentic workflows will likely drive more progress than even the next-gen foundation models.
This might be a really useful resource for those of you who work in AI and with agents.
Demos and presentations are an integral part of every presales and sales professional’s life. Quite often, I have encountered situations where teams are required to deliver demos at short notice with minimal information. Whether it’s showcasing POC work or presenting a POV as part of an RFP evaluation, demos offer a valuable opportunity to highlight key capabilities to stakeholders in a short span of time. Solution demos play a crucial role in sales conversations, and making them impactful can often be challenging; if not executed well, they can have significant consequences.
A partner once recommended the book "Great Demo!: How to Create and Execute Stunning Software Demonstrations" by Peter Cohan to me, and I found it incredibly insightful.
I highly recommend the book to my colleagues in Presales and Solution teams, especially those involved in product, solution, and POC demos.
Gathering the necessary requirements from prospects is often tough, and with limited time, we must make the most of every opportunity. One of the common dilemmas we face is deciding what to showcase and what to leave out.
The book “Great Demo!" is about shifting the focus from a feature-heavy presentation to a problem-solving, customer-focused narrative. By keeping the customer’s needs at the center of the demo, the approach helps sales teams deliver more compelling presentations, shorten the sales cycle, and close deals more effectively.
This book provides a structured approach to software demos, focusing on demonstrating products in a way that captures customer interest, engagement, and buying intent. Below is a detailed summary of the core principles and strategies from the book to help guide effective and compelling software demonstrations.
One core principle is to deliver the primary value or result the customer seeks right at the beginning. Instead of starting with extensive background information, jump straight to what the customer wants to see. This approach captures the audience's attention quickly and immediately establishes the product's relevance to their needs.
Peter Cohan stresses the importance of identifying the prospect's Critical Business Issue (CBI) before creating or delivering a demo. Understanding this core problem ensures that every feature or functionality shown in the demo directly addresses the customer’s most pressing needs.
A successful demo should clearly present how the product addresses the customer's Critical Business Issue. The book lays out a structured workflow for software demos built around four key steps, moving from the most compelling result to the supporting detail.
An effective demo should not be a technical walkthrough but a focused presentation on how the product provides value to the customer. Rather than overwhelming them with features, concentrate on functionalities that directly solve their problem.
Cohan suggests using the inverted pyramid model, common in journalism, where the most important information is presented first, followed by supporting details. This keeps the audience engaged from the beginning and allows them to leave at any point without missing key takeaways.
Deliver a Minimal Viable Demo, showcasing only the essential parts of the product that directly address the customer’s issue. Avoid spending time on features that do not contribute to solving their critical business problem.
Demos structured this way are designed to help the customer envision success. By illustrating how your solution fits their specific needs, you enable the prospect to see the value it can bring to their organization, fostering early buy-in and commitment.
Encourage interaction by asking questions, seeking feedback, and allowing the prospect to use the software. This makes the demo feel like a collaborative exploration rather than a one-sided presentation, enhancing engagement.
Cohan recommends addressing customer questions and objections openly during the demo. Rather than avoiding tough questions, tackle them head-on while maintaining control of the demo flow, linking responses back to the product’s key value proposition.
Just like a movie, a software demo needs a storyboard. Plan out what you will show, the sequence of steps, and how you will communicate each point. This ensures a smooth, logical flow that keeps the audience engaged.
Keep the flow of the demo simple, avoiding unnecessary complexities. The demo should be easy for the audience to follow, focusing on how it solves their problem, not just how it functions.
Present the demo in the context of specific customer scenarios or use cases. By demonstrating how the software solves a real-world problem, the prospect will have a clearer understanding of the product’s value.
Throughout the demo, emphasize and repeat the key benefits and outcomes the customer will gain. Reinforcement ensures that these points resonate with the audience and that they leave the demo with a clear understanding of how your solution helps them.
At the end of the demo, summarize the main points and reiterate the value your product brings. Remind the prospect of the Critical Business Issue and how your solution resolves it.
After the demo, follow up promptly with materials that reinforce the key takeaways.
Build on the demo by creating a tailored proposal that addresses the specific needs and goals identified during the demonstration. This ensures continuity in the sales process and maintains momentum.
https://www.youtube.com/watch?v=Gn7GDQeG0vQ
I'm sure that since ChatGPT went mainstream, you've been hearing the term LLM quite frequently. The article below provides a clear and insightful explanation of Large Language Models (LLMs) and the concepts of tokens and embeddings.
The article explores how LLMs process text by converting it into numerical representations. It first explains why text must be transformed into numbers for machine learning systems, emphasizing that tokens—the fundamental units derived from text—are mapped to unique numeric identifiers.
While words might seem like natural token candidates, the article highlights that tokens can also be sub-word units, offering greater flexibility in text representation. This approach helps address challenges such as case sensitivity and the emergence of new words, which can complicate text processing. By breaking text into smaller components, like characters or sub-words, LLMs can handle linguistic variations and nuances more effectively.
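To make the sub-word idea concrete, here is a toy greedy tokenizer over an invented vocabulary; production LLMs use learned vocabularies (for example BPE) with tens of thousands of entries, but the token-to-ID mechanics are the same.

```python
# Toy sub-word tokenization: greedily match the longest vocabulary entry,
# so unseen words still break down into known pieces with numeric IDs.
vocab = {"un": 0, "break": 1, "able": 2, "token": 3, "s": 4, " ": 5}

def tokenize(text: str) -> list[int]:
    ids, i = [], 0
    while i < len(text):
        match = None
        # Try the longest vocabulary pieces first (greedy longest match).
        for piece, piece_id in sorted(vocab.items(), key=lambda kv: -len(kv[0])):
            if text.startswith(piece, i):
                match = (piece, piece_id)
                break
        if match is None:
            raise ValueError(f"no token covers {text[i]!r}")
        ids.append(match[1])
        i += len(match[0])
    return ids

print(tokenize("unbreakable tokens"))   # [0, 1, 2, 5, 3, 4]
```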
The article also delves into embeddings, which are vector representations of tokens that capture their meanings and relationships in a continuous vector space. These embeddings allow LLMs to understand context and semantics, enhancing their ability to perform tasks like language generation and comprehension.
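And here is a minimal illustration of embeddings as vectors whose geometry encodes relatedness; the three-dimensional vectors below are hand-picked toy values, whereas real models learn embeddings with hundreds or thousands of dimensions.

```python
# Toy embeddings: nearby vectors represent related tokens. The numbers are
# hand-picked for illustration; real embeddings are learned during training.
import math

embeddings = {
    "king":   [0.90, 0.80, 0.10],
    "queen":  [0.85, 0.82, 0.15],
    "banana": [0.10, 0.05, 0.95],
}

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) *
                  math.sqrt(sum(x * x for x in b)))

print(cosine(embeddings["king"], embeddings["queen"]))   # close to 1.0
print(cosine(embeddings["king"], embeddings["banana"]))  # much smaller
```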
Overall, the piece underscores the crucial role of tokenization and embeddings in improving LLMs' capabilities in natural language processing (NLP).
https://msync.org/notes/llm-understanding-tokens-embeddings/
Came across a solid CXOTalk discussion: “Why AI Works, but Your Strategy Isn’t.” Host: Michael Krigsman. Guest: Sangeet Paul Choudary. Lin...