GENERATIVE AI GROWTH AUTOMATION: PRACTICAL USE CASES

Apr 12, 2024

Selling has always been about building relationships and creating value, and no advance in technology has ever changed this, nor ever will - but the scale, the nature of the value, and what you’re selling always change, and always will.

Generative AI is not just an opportunity to create new products and services to sell with existing growth and sales strategies, or to sell more effectively with Gen AI itself. It is an opportunity to free sales leaders and other stakeholders to spend more time building relationships that are not immediate sales, and creating value that is not immediately monetizable.

Using Generative AI to optimize sales funnels and boost your preferred three-letter KPI is obvious, and it will happen, but it is not what we focus on at Digital Leverage, at least not solely. Such optimization use cases are important and do create value, but they represent a limited vision that ignores what Gen AI has newly unlocked.

Fundamentally, there are two really valuable, no-hype functions of Generative AI that you should understand:

The ability to process unstructured data as structured data:

This is accomplished via function calling. Popularized by OpenAI, and now available in most models, it was originally intended to let LLMs use tools by calling external APIs, but it is also extremely useful for taking any unstructured text and processing it into a JSON object.
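To make this concrete, here is a minimal sketch of that unstructured-to-structured conversion using the OpenAI Python SDK's function calling. The model name, schema fields, and sample text are illustrative assumptions, not a prescription:

```python
# Minimal sketch: function calling to turn unstructured text into a JSON object.
# Assumes the openai package is installed and OPENAI_API_KEY is set; the model
# name and schema fields below are illustrative.
import json
from openai import OpenAI

client = OpenAI()

tools = [{
    "type": "function",
    "function": {
        "name": "extract_lead_details",
        "description": "Extract structured lead details from raw text.",
        "parameters": {
            "type": "object",
            "properties": {
                "company_name": {"type": "string"},
                "industry": {"type": "string"},
                "employee_count": {"type": "integer"},
            },
            "required": ["company_name"],
        },
    },
}]

raw_text = "Acme Corp is a 50-person logistics startup based in Austin."

response = client.chat.completions.create(
    model="gpt-4o",  # illustrative; any function-calling model works
    messages=[{"role": "user", "content": f"Extract lead details:\n{raw_text}"}],
    tools=tools,
    tool_choice={"type": "function", "function": {"name": "extract_lead_details"}},
)

# The "tool call" arguments come back as a JSON string we can parse directly.
args = response.choices[0].message.tool_calls[0].function.arguments
print(json.loads(args))  # e.g. {"company_name": "Acme Corp", "industry": "logistics", ...}
```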

Mimicry of low-level, step-by-step reasoning: In my view, this is thanks to the nature of language itself, which is why I specify mimicry. This is an important distinction that I'll dive into below, but it allows for semantic automation, and it critically defines how you need to guardrail Gen AI-based systems and what to expect when using one.

The opportunities unlocked by these novel capabilities are incredible. For Growth Uses:

Automation and scaling of previously manual research workflows. Ex: web scraping lead websites, using Gen AI to classify them by your ICP categories and return high-value ICPs, and using lead details in outreach messaging.

Automation of entire tech stacks using semantic routing to orchestrate APIs. Ex: inbox & CRM automation - classify all emails semantically, draft responses, and update CRM deal stages, contact info, etc.

Programmatic personal value creation for every lead. Ex: semi-personalize outreach messaging based on lead details, or feed lead context into a multi-step workflow to personalize sections of docs - like this paper ;)

These use cases of Generative AI are what we are excited about here at Digital Leverage.

What is Generative AI?

The technical explanation of Generative AI is fascinating, but frankly, you can use it effectively with no idea how it works under the hood. That said, it is helpful to know:

How it Works Conceptually:

  • Generative AI is a subset of a larger area of work known as Deep Neural Networks.

  • Generative AI models do not train on full words or images themselves; instead, the data is tokenized (broken down into pieces that are interpretable to the model). This is where the phrase “next token prediction machine” comes from (a short tokenization sketch follows this list).

  • Generative AI models ingest large amounts of data (text, images, or video), known as their training corpus, and use matrix multiplication to fit that data to a pattern; the trained model is then used to predict the next token in a sequence using that pattern. This is a simplification, of course, but it’s conceptually correct and useful for working with generative models.
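To make "tokenized" concrete, here is a minimal sketch using the tiktoken library (an assumption; any tokenizer illustrates the same idea):

```python
# Minimal sketch of tokenization: text becomes integer token ids, and those
# ids round-trip back to the original text. Assumes tiktoken is installed.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")  # encoding used by several GPT-4-era models
tokens = enc.encode("Generative AI models predict the next token.")
print(tokens)              # e.g. [5648, 1413, ...] - the units the model actually sees
print(enc.decode(tokens))  # "Generative AI models predict the next token."
```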

What Buzzwords Mean:

  • GPT: Generative Pretrained Transformer. A type of transformer model, but not all generative AI models are GPT-based (some are Diffusion models). For more information, see a visual of a GPT here.

  • Training Corpus: the whole body of data assembled to train a model

  • Token: Atomic unit of data fed into an untrained model and generated by a trained one. Raw data is cleaned, labeled (most of the time) and then tokenized to be ready for training

  • Next-token Prediction: The goal of a Generative AI model - what training optimizes for.

  • Training: The process of feeding the training corpus to the model and fitting its weights to it

  • Inference: When a trained model is used to solve its goal (next token prediction)

  • RAG (Retrieval Augmented Generation): Inserting relevant information into your prompt to ensure the model generates higher quality results. RAG is usually done automatically, using tools like vector databases and embedding models (a minimal retrieval sketch follows this list).

  • Fine Tuning: Adding additional training data to an already trained model to improve it for a specific task

  • World Model: "A core concept in AI agent and decision making. It is our mental simulation of how the world works given interventions (or lack thereof)." Whether these models have a world model or not is an area of debate and interest. Related links: here

  • Stochastic Parrot: "theory that large language models, though able to generate plausible language, do not understand the meaning of the language they process" paper link, discussion link
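Here is a minimal sketch of the RAG idea defined above: embed a few snippets, retrieve the one closest to the query, and inject it into the prompt. The model names, documents, and similarity math are illustrative assumptions:

```python
# Minimal RAG sketch: embed snippets, retrieve the closest one to the query,
# and insert it into the prompt. Assumes openai and numpy are installed and
# OPENAI_API_KEY is set; model names are illustrative.
import numpy as np
from openai import OpenAI

client = OpenAI()

docs = [
    "Our ICP is B2B SaaS companies with 50-500 employees.",
    "Pricing starts at $2,000/month for the growth tier.",
]

def embed(texts):
    resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([d.embedding for d in resp.data])

doc_vecs = embed(docs)
query = "How much does the growth plan cost?"
q_vec = embed([query])[0]

# Cosine similarity against each snippet; keep the best match as context.
sims = doc_vecs @ q_vec / (np.linalg.norm(doc_vecs, axis=1) * np.linalg.norm(q_vec))
context = docs[int(np.argmax(sims))]

answer = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user",
               "content": f"Using only this context:\n{context}\n\nAnswer: {query}"}],
)
print(answer.choices[0].message.content)
```

In production, the hand-rolled cosine similarity is usually replaced by a vector database, but the flow is the same: retrieve, then generate.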

The Field Itself:

  • Generative AI, and AI research in general, is an actively progressing space. Renowned researchers hold entirely opposite opinions on Gen AI model capabilities, how these models work, and why they work, and the debates are often heated.

  • Gen AI model interpretability is a subfield of research that focuses on understanding how and why a model produced a result. "Interpretability is the degree to which a human can understand the cause of a decision. The higher the interpretability of an ML model, the easier it is to comprehend the model’s predictions." aws intro

Practical Usage Of Generative AI

An analogous explanation of Generative AI based on practical experience is far more useful for our goals. I am no deep learning researcher, but I do consider myself an AI Implementation Engineer because I work with foundation models every day, whether to help me write code, in my own products, or in other use cases.

The practical expertise required to integrate Claude Opus/GPT-4 via API, fine-tune and self-host a model, prompt engineer, or create actually useful products and solutions with Gen AI models is an entirely different skill set from creating Gen AI models from scratch - they overlap, of course, but it is its own, still brand-new field.

The below is based on my own experience working with Generative AI models. I am not an ML researcher, but you don’t need to be one to learn how to work with models practically and create value. This is meant as practical advice - I’ll bet some of these takes will be proven wrong or off - but I believe it works well for deploying current Gen AI systems as real product/service-level solutions to product/service-level problems. Your mileage may vary.

For Practically Working with Generative AI Models, You Should Understand:

Context Windows:

  • Models have a limit on how much information can be processed at one time - the length of text you can provide that the model can use to generate a response.

  • GPT-4 started with ~8k and ~32k token context windows, and is now up to ~128k

  • Claude Opus has a window of ~200k.

  • Mistral 7B has a window of ~8k but can be extended, as it is open source.

  • Some models are better than others at accessing and using different parts of the context window; many have limitations when leveraging text deep in the context window.
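A minimal sketch of guarding against the context limit: count tokens before calling the model and truncate (or chunk) if necessary. The limit, headroom, and tokenizer below are illustrative assumptions:

```python
# Minimal sketch: check whether a prompt fits a model's context window and
# truncate it if not. The 128k limit and cl100k_base encoding are illustrative.
import tiktoken

CONTEXT_LIMIT = 128_000     # tokens
RESPONSE_HEADROOM = 4_000   # leave room for the model's reply

enc = tiktoken.get_encoding("cl100k_base")

def fit_to_window(prompt: str) -> str:
    tokens = enc.encode(prompt)
    budget = CONTEXT_LIMIT - RESPONSE_HEADROOM
    if len(tokens) <= budget:
        return prompt
    # Naive strategy: keep the most recent tokens; chunking or RAG is usually better.
    return enc.decode(tokens[-budget:])

long_prompt = "..."  # e.g. a scraped website plus your instructions
safe_prompt = fit_to_window(long_prompt)
```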

No Reasoning, Still Technological Marvel:

There is no reasoning occurring in any model, so you have to find workarounds to get it to generate what you’re solving for. This is an area of debate at the research level, but I have found that if you start with this assumption, it is far easier to get value from these models. Do not depend on one like you would a person; do not anthropomorphize. Things to know:

No reasoning means no negation: The model has no real understanding of yes or no.

Practically, this means you need to learn how to phrase all your prompts as if you were speaking to someone who did not really understand what "no" meant: instead of "do not mention pricing," write "discuss only product features and integration options."

No ability to generalize: If an event, idea, or other piece of information is not represented in the model's vector space, it cannot deduce or induce conclusions about that information. Models do not actually reason the way a person would.

  • This is where RAG, and to a lesser extent fine-tuning, becomes a critical value-add for working with your own data

  • For larger context windows and smaller amounts of data, you can feasibly just inject your data into the prompt and chain together API call outputs (a minimal chaining sketch follows this list).
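Here is a minimal sketch of chaining API call outputs: one call classifies an email, and its output is injected into a second call that drafts a reply. The model name, categories, and prompts are illustrative assumptions:

```python
# Minimal sketch of chaining two model calls: classify, then draft using the
# classification. Assumes openai is installed and OPENAI_API_KEY is set.
from openai import OpenAI

client = OpenAI()

def ask(prompt: str) -> str:
    resp = client.chat.completions.create(
        model="gpt-4o",  # illustrative
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content

email = "Hi - we're interested, but the price seems high for our team of 12."

# Step 1: classify the email into one of a fixed set of categories.
category = ask(
    "Classify this email as exactly one of: pricing_objection, feature_question, "
    f"not_interested, other. Reply with the label only.\n\nEmail:\n{email}"
)

# Step 2: feed the classification (and the email) into the drafting call.
draft = ask(
    f"The email below is a {category.strip()}. Draft a short, friendly reply that "
    f"addresses it and proposes a next step.\n\nEmail:\n{email}"
)
print(category, "\n---\n", draft)
```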

How to think about working with Gen AI models:

Every task or problem you want to tackle with Gen AI still requires you and your knowledge. You are still the driver, and however you frame it, you are still responsible for how you use model outputs.

This means detailed instructions, step-by-step examples, and guiding the model in the right direction at all times.

Have a problem direction to solve for: You don't need to know exactly what you're looking for, but you need to be able to know it when you see it.

It's all about context. To ensure that your Gen AI creates what you actually want it to do, you must ensure you are feeding it the right context about your problem space. In this framing, prompt engineering is still an underrated solution. Provide timely, detail context in your prompt and restrict the model to solve one specific problem using that context prompt. It has a tremendous amount of information stored in its weights, but nothing specific to you and what you want, you must provide it.

Think of AI as:

  • General purpose tool for knowledge work

  • Enabler/Empowerment for you to explore new domains and get basic understanding

  • Amplifier of your current expertise and understanding

A mental model for which Jobs to be Done are worth automating, and which are worth - and will always be worth - doing yourself. Article: here


Growth AI Automation Use Cases

Outbound Use Cases

Scaling Previously Manual Research Workflows

Value Creation: This is probably the most valuable use case, as the rest of the use cases directly or indirectly depend on it. Any AI company/product is a data company/product - the degree of value you can create depends directly on your data and your ability to leverage it.

The structured/unstructured conversion and reasoning mimicry that Gen AI enables let you turn any digital object (website, PDF, etc.) into an API and build knowledge-work workflows.

A few example tasks that were not automatable before and that Gen AI now enables:

ICP Classification: You can use Gen AI to parse scraped web pages and classify them by the semantic meaning of the content. This is one way I get Clay to classify leads by ICP (a minimal sketch follows these examples).

Job Details Extraction: You can scrape LinkedIn job posts for your target industry and pass each one to GPT to check whether the posting mentions budget or hiring plans. You can use this to gauge growth rate and budget!

News Summarization: Ingest daily news articles and content about a lead or industry to use as timely outreach triggers. Chaining these together allows you to build robust, repeatable, and scalable data pipelines that would otherwise have to be run manually. It also enables you to use this data to synthesize novel messaging/content down the line.
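A minimal sketch of the ICP classification step mentioned above: scraped page text goes in, a constrained label comes out, and anything outside the allowed labels fails out. The categories and model name are illustrative assumptions (Clay and similar tools wrap this same pattern):

```python
# Minimal sketch: classify scraped page text into ICP categories and fail out
# anything that doesn't match an allowed label. Assumes openai is installed
# and OPENAI_API_KEY is set; categories and model name are illustrative.
from openai import OpenAI

client = OpenAI()
ICP_CATEGORIES = {"b2b_saas", "ecommerce", "agency", "not_a_fit"}

def classify_icp(page_text: str) -> str:
    resp = client.chat.completions.create(
        model="gpt-4o",
        messages=[{
            "role": "user",
            "content": (
                "Classify the company described below into exactly one of: "
                + ", ".join(sorted(ICP_CATEGORIES))
                + ". Reply with the label only.\n\n" + page_text
            ),
        }],
    )
    label = resp.choices[0].message.content.strip().lower()
    # Guardrail: anything outside the allowed labels is treated as not a fit.
    return label if label in ICP_CATEGORIES else "not_a_fit"

print(classify_icp("Acme helps online retailers sync inventory across marketplaces."))
```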

Why It Matters: At scale, these research workflows would take hundreds of hours to execute manually. Gen AI makes them cheap, feasible and scalable.

Honest Semi-Personalized Outreach with Genuine Value

Value Creation: By leveraging Gen AI, we craft outreach that demonstrates a deep understanding of each lead's unique challenges - not cheap "oh, I know what college you went to, I'll have GPT write a poem about it." This approach moves beyond generic triggers and firmographics, creating a distinct and memorable interaction that provides real value upfront. The key to using Gen AI for outreach messaging personalization is the genuine value behind the message, not the personalization itself.

Honesty: Nobody cares if you use Gen AI to help personalize parts of your outreach. It is still a cold sales email, not a heartfelt anniversary card. But people do care when it looks - and therefore is - low effort and cheap, and when you are not honest about how you are using AI. You don't have to disclose in every email that parts of it were generated, although I suspect that is the direction we are heading; for now, just own it and lean into it - no need to hide it. If someone asks, own it.

Still answering "Why Us?" and "Why Now?": Selling is still about addressing why the lead should buy from you (your value proposition and differentiation) and why they should buy now (what is pressing about the situation, what they gain by buying now versus down the road). You could have someone's entire digital footprint to personalize against (basically what FB ads try to do), and if the value isn't there, it doesn't matter.

Why It Matters: In a landscape where standard personalization tactics have become commoditized, the ability to offer tailored insights and solutions sets your outreach apart, fostering long-term relationships over short-term gains.

Sales Ops Use Cases

AI-Driven Semantic Sales Ops & CRM Automation

Value Creation: Many of the CRM updates, notes, follow-ups, and other manual parts of the sales process are easy to automate without AI - the problem is that your specific sales process is not easy to automate. Semantic routing changes this: you can use Gen AI to create and update deals based on email content, update deal stages, and automate basically any step-by-step knowledge-work process that doesn't require creating anything novel, just transformation. This means you can use your sales playbooks and brand guidelines to draft personalized responses to common objections. This not only saves time but also ensures consistency in communication, reflecting your company's unique voice and expertise.
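A minimal sketch of that semantic routing idea: the model picks the route, and ordinary code executes it. The intents, deal stages, and the update_crm helper are illustrative stand-ins, not a real CRM integration:

```python
# Minimal semantic routing sketch: the model classifies intent, plain Python
# performs the CRM update. Assumes openai is installed and OPENAI_API_KEY is
# set; intents, stages, and update_crm are illustrative stand-ins.
from openai import OpenAI

client = OpenAI()

INTENTS = {
    "ready_to_buy": "proposal_sent",
    "needs_follow_up": "negotiation",
    "not_interested": "closed_lost",
}

def update_crm(deal_id: str, stage: str) -> None:
    # Stand-in for a real CRM API call (HubSpot, Salesforce, etc.).
    print(f"Deal {deal_id} moved to stage: {stage}")

def route_email(deal_id: str, email_body: str) -> None:
    resp = client.chat.completions.create(
        model="gpt-4o",
        messages=[{
            "role": "user",
            "content": (
                "Classify this email as exactly one of: "
                + ", ".join(INTENTS) + ", other. Reply with the label only.\n\n"
                + email_body
            ),
        }],
    )
    intent = resp.choices[0].message.content.strip().lower()
    if intent in INTENTS:
        update_crm(deal_id, INTENTS[intent])
    else:
        # Guardrail: out-of-bounds emails fail out to a human, never auto-update.
        print(f"Deal {deal_id}: intent '{intent}' not recognized, flagging for review")

route_email("D-1042", "Thanks for the call - send over the contract and we'll sign this week.")
```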

Why It Matters: Sales operations is not relationship building, it is not the end value provided to clients and customers; automating repetitive tasks allows your team to focus on the creative and relational aspects of sales that require a human touch, aligning with our view that AI Automation should facilitate, not replace, human intention and creativity.

Inbound Use Cases

Inbox Triage and Prioritization

Value Creation: Most of the emails you get are pure noise; some are interesting enough to warrant a response. The majority of the emails that warrant a response, especially in a sales context, exist within the same problem space and are a derivative of the same lead question or objection you've responded to a million times before. By leveraging Gen AI to categorize and prioritize emails, and to draft starter responses to these questions, you can significantly reduce the time it takes to work through this group of messages.
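A minimal sketch of the triage step: score a batch of emails for priority in one pass and work through them highest-first. The 1-5 scale, model name, and sample emails are illustrative assumptions:

```python
# Minimal triage sketch: ask the model for a 1-5 priority per email, then sort.
# Assumes openai is installed and OPENAI_API_KEY is set; scale and model are
# illustrative, and non-numeric replies fall back to the lowest priority.
from openai import OpenAI

client = OpenAI()

emails = [
    "Newsletter: 10 growth hacks you missed this week",
    "We'd like a quote for 40 seats, ideally before Friday.",
    "Quick question about your API rate limits.",
]

def priority(email_body: str) -> int:
    resp = client.chat.completions.create(
        model="gpt-4o",
        messages=[{
            "role": "user",
            "content": (
                "Rate this email's sales priority from 1 (noise) to 5 (hot lead). "
                "Reply with the number only.\n\n" + email_body
            ),
        }],
    )
    text = resp.choices[0].message.content.strip()
    return int(text) if text.isdigit() and 1 <= int(text) <= 5 else 1

for body in sorted(emails, key=priority, reverse=True):
    print(body)
```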

Why It Matters: AI is not going to replace relationship building, including over email. But it can help you save time handling the email equivalent of "this meeting could have been an email."

Content Use Cases

Personalized Lead Magnets

Value Creation: Combining your domain expertise with Gen AI, you deliver high-quality, insightful documents that are tailored to each prospect's needs, significantly boosting conversion rates.

Why It Matters: There is a growing and strengthening trend in B2B, and in general, where leads want to see a video demo or async proof of value before jumping on any call. This makes sense. So lean into demonstrating the value of your service/product up front.

Personalized Web Pages & Lead Touch Points:

Value Creation: This is a deeper play on the above and requires integration with your web stack, but imagine how cool it would be if you could land on a website, have it ask whether you want a personalized experience (name, domain, and LinkedIn), and have it tailor the sections of the site to your needs.

Why It Matters: We believe this is going to happen for all digital touchpoints to some extent - AI is going to become the new form factor for computing. It would be incredibly valuable to leverage this trend to help leads understand why your solution addresses their problem, by showing them instead of telling them a generic value prop and asking them to imagine it. Show them up front.

Digital Leverage’s Approach for Building AI Automation Systems

High Quality Data + Semantic Routing + High Quality Prompt + Templates & Few Shot

  • clear problem statement: you must know what you are solving for; you don't need to know how you will solve it or to what degree, but you must know why you want to solve it.

  • high quality data:

    • lead gen: basic starting point is at least firmographics + likely decision-maker

    • content acceleration: high quality reference articles

    • emails & docs: few shot examples of solved/good output sections

  • semantic routing:

    • map your problem space: what is in bounds? what is out of bounds? Set your goal, build a fence around it, and fail out anything beyond that fence.

    • define how you solve: step by step, this action or that action; this enables strict, high quality execution of your process by an LLM.

  • guardrails: create exit and failure points to ensure only problem-space specific scenarios pass through for processing

  • knowledge work prompts + high quality instructions:

    • Detailed Instructions: specific, step-by-step prompts that outline the exact way to generate the desired section

    • Few shot: examples of completed artifacts

    • Context Booster: preceding and succeeding text provided as context to ensure the generated section fits seamlessly

  • templated artifacts:

    • Semi-personalization: only the sections that require personalization are generated

    • Surface area reduction: reduce generated sections as much as possible; simplify the document and section workflows
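Putting the pieces above together, here is a minimal sketch of the template + few-shot + guardrail pattern: only one section of a fixed document template is generated, the prompt carries a solved example plus surrounding context, and out-of-scope leads fail out before any generation happens. All lead data, the example, and the model name are illustrative assumptions:

```python
# Minimal sketch of the approach above: guardrail first, then generate only the
# personalized section of a templated document using detailed instructions, a
# few-shot example, and surrounding context. Assumes openai is installed and
# OPENAI_API_KEY is set; lead data, example, and model are illustrative.
from openai import OpenAI

client = OpenAI()

TEMPLATE = """Hi {first_name},

{personalized_section}

Here's a 3-minute walkthrough of how we'd approach this: <video link>

Best,
Alex"""

FEW_SHOT_EXAMPLE = (
    "Lead: B2B SaaS, 120 employees, hiring 4 SDRs\n"
    "Section: Scaling an SDR team usually means CRM hygiene breaks first - "
    "here is how we automate deal-stage updates so your new hires sell instead of typing."
)

def generate_doc(lead: dict) -> str | None:
    # Guardrail: only leads inside the problem space get a generated section.
    if lead.get("icp") not in {"b2b_saas", "agency"}:
        return None  # fail out for human review instead of generating

    prompt = (
        "Write one short paragraph for a sales one-pager.\n"
        "Follow the style of this solved example:\n" + FEW_SHOT_EXAMPLE + "\n\n"
        "It will appear after a greeting and before a video link, so it must read "
        "as a standalone paragraph.\n"
        f"Lead: {lead['industry']}, {lead['employees']} employees, note: {lead['note']}"
    )
    resp = client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": prompt}],
    )
    section = resp.choices[0].message.content.strip()
    return TEMPLATE.format(first_name=lead["first_name"], personalized_section=section)

lead = {"first_name": "Dana", "icp": "b2b_saas", "industry": "B2B SaaS",
        "employees": 120, "note": "job post mentions hiring 4 SDRs"}
print(generate_doc(lead))
```

Keeping the generation surface area to a single section is what makes the output predictable enough to ship at scale.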

Digital Leverage © 2024


MISSION: LEVERAGE TECH TO FREE EACH OF US TO EMBRACE OUR HUMANITY.