
Automating Sales Outreach with AI: Building an Agentic Workflow Using LangGraph

  • 6 min read

Crafting an effective sales outreach email is a tedious and time-consuming task. Manually researching a target company, understanding its services, and identifying how your company can provide value requires significant effort. But what if we could solve this with an AI-driven agentic workflow? In this blog, you'll learn how to automate sales outreach with AI by scraping key pages from target companies, summarizing their services, and generating highly relevant emails using a LangGraph workflow.

How Does It Work?

LangGraph workflow for Sales outreach with AI

The agentic workflow begins by scraping key web pages from the target company that detail its services. Typically, pages like About Us and Solutions provide valuable insight into its offerings.

After extracting content from the scraped pages, the workflow generates individual summaries for each page. Then, it consolidates these summaries into a comprehensive final summary, providing a clear and concise overview of the target company’s services.

Using the generated summary and the source company’s service description, the workflow crafts a personalized sales outreach email. This ensures the message is highly relevant, effectively aligning the source company’s offerings with the target company’s needs.

Finally, if the user wants to modify the generated email, they can give suggestions to the LangGraph workflow, and it will regenerate the email with that feedback. If no feedback is given, the workflow ends.

Let’s Build a Sales Outreach LangGraph Workflow

Now, let’s build a LangGraph workflow step by step. But first, let’s install and import all the necessary packages.

pip install langgraph langchain-openai llama-index-readers-web validators
import operator
from typing import TypedDict, Annotated, List

from langchain_core.messages import SystemMessage, HumanMessage
from langchain_openai import ChatOpenAI
from langgraph.checkpoint.memory import MemorySaver
from langgraph.constants import Send
from langgraph.graph import StateGraph, START, END
from llama_index.readers.web import BeautifulSoupWebReader

Now let’s define the state of a LangGraph workflow.

class ScrapeMaster(TypedDict):
    target_urls: Annotated[List, "The list of target URLs"]
    summerized_url_text: Annotated[List[str], operator.add]  # Summaries of the scraped pages
    target_summmary: Annotated[str, "The final summary"]
    source_text: Annotated[str, "The service description of the source organization"]
    draft_email: Annotated[str, "The final draft email"]
    human_reviewer_feedback: str
    llm: object

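A note on the reducer: fields annotated with `operator.add` are merged by concatenation when parallel branches return partial updates for the same key. A minimal plain-Python illustration of that merge (no LangGraph required; the variable names are illustrative only):

```python
import operator

# LangGraph applies the reducer to combine partial updates to the same key.
# For Annotated[List[str], operator.add], two parallel nodes each returning a
# one-element list produce a single concatenated list in the shared state.
update_from_node_a = ["summary of page A"]
update_from_node_b = ["summary of page B"]

merged = operator.add(update_from_node_a, update_from_node_b)
print(merged)  # ['summary of page A', 'summary of page B']
```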
Scrape and Summarize the content of the target website

Here, we scrape each target URL and summarize the extracted text.

# Activate the scraping loader
loader = BeautifulSoupWebReader()

# Per-URL state used by the Send-based fan-out below
class SummerizeURLs(TypedDict):
    url_to_scrape: str
    llm: object

def scrape_and_summerize_text(state: SummerizeURLs):
    """Scrapes the text from the given URL and summarizes it."""

    url_to_scrape = state['url_to_scrape']

    # Scrape the URL
    scraped_text = loader.load_data([url_to_scrape])[0].text

    sys_msg = """You are a professional text summarizer specializing in business insights. For the given text,
    generate a concise summary highlighting the business aspects, such as the solutions being provided, the target audience, and the value proposition."""

    result_summary = state['llm'].invoke([SystemMessage(content=sys_msg), HumanMessage(content=f"Give me a summary of {scraped_text}")])

    return {'summerized_url_text': [result_summary.content]}


def scrape_summeries(state: ScrapeMaster):
    """Fans out one scrape-and-summarize task per target URL."""

    target_urls = state['target_urls']

    # Dispatch one parallel task per URL
    return [Send("scrape_and_summerize_text", {"url_to_scrape": url, "llm": state['llm']}) for url in target_urls]

For this step, I have used BeautifulSoupWebReader from LlamaHub, which returns the scraped text in a structured format, though you can use other scraping packages as well. A system prompt then instructs the LLM to summarize the full text.

In scrape_summeries, the Send function dispatches the URLs from target_urls to the scraping node in parallel. See the LangGraph documentation on the Send API for more information.

Create a final summary of the URL summaries

Now, in this step, we combine the individual summaries into a single final summary. Consolidating them also reduces the cost of downstream LLM calls when you are passing multiple URLs.

def summerized_all_summaries(state: ScrapeMaster):
    """Combines the per-URL summaries into one final summary."""

    summerized_url_text = state['summerized_url_text']

    sys_msg = """You are a professional text summarizer specializing in business insights. For the given text, generate a concise summary highlighting the business aspects, such as the solutions being provided, the target audience, and the value proposition."""

    result = state['llm'].invoke([SystemMessage(content=sys_msg), HumanMessage(content="\n".join(summerized_url_text))])

    return {'target_summmary': result.content}

Generate a draft email

Now comes the most crucial function of the project. It combines the provided source-company description with the target summary and generates the outreach email.

def generate_draft_email(state: ScrapeMaster):
    """Generates a draft outreach email."""

    # Fetch the source and target texts
    source_text = state['source_text']
    target_text = state['target_summmary']
    human_reviewer_feedback = state.get("human_reviewer_feedback", None)

    sys_msg = f"""Draft a professional and persuasive email from [Source Text] to [Target Text], highlighting how the services offered by [Source Text] can enhance or complement the services provided by [Target Text].
                Ensure the email is engaging, value-driven, and customized to the recipient's industry. Maintain a polite and professional tone while clearly outlining the key benefits of collaboration. In the subject, include the source company's name.
                Include a compelling introduction, specific advantages, and a call to action that encourages further discussion. Only give an email as output, nothing else.

                Examine any editorial feedback that has been optionally provided to guide the email's creation. (The feedback can be None):
                {human_reviewer_feedback}

                Source Text: {source_text}
                Target Text: {target_text}"""

    result = state['llm'].invoke([SystemMessage(content=sys_msg), HumanMessage(content='Give me a draft email.')])

    return {'draft_email': result.content}

Here we have also introduced human_reviewer_feedback to capture optional human feedback when regenerating the email. Let's also define a placeholder review node and a decision function.

def human_feedback_func(state: ScrapeMaster):
    # No-op node: the graph is interrupted before this node so a human can review
    pass

def should_continue(state: ScrapeMaster):
    human_reviewer_feedback = state.get('human_reviewer_feedback', None)

    if human_reviewer_feedback:
        return "generate_draft_email"
    else:
        return END

This executes generate_draft_email once again if there is any human feedback; otherwise, it exits the flow.

Create a LangGraph workflow

Now that our functions are ready, let's create the nodes and edges and compile them into a graph.

workflow = StateGraph(ScrapeMaster)

# Define nodes
workflow.add_node("scrape_and_summerize_text", scrape_and_summerize_text)
workflow.add_node("summerized_all_summaries", summerized_all_summaries)
workflow.add_node("generate_draft_email", generate_draft_email)
workflow.add_node("human_feedback_func", human_feedback_func)

# Define edges
workflow.add_conditional_edges(START, scrape_summeries, ["scrape_and_summerize_text"])
workflow.add_edge("scrape_and_summerize_text", "summerized_all_summaries")
workflow.add_edge("summerized_all_summaries", "generate_draft_email")
workflow.add_edge("generate_draft_email", "human_feedback_func")
workflow.add_conditional_edges("human_feedback_func", should_continue, ["generate_draft_email", END])

# Define memory
thread_memory = MemorySaver()

# Compile the workflow, pausing before the human review node
graph = workflow.compile(checkpointer=thread_memory, interrupt_before=['human_feedback_func'])

Demo of the App

For ease of use, I’ve built a Streamlit app on top of the entire LangGraph workflow. You can find the complete code on my GitHub.

Let’s consider a case where NVIDIA wants to collaborate with Intel for its AI services. The app will scrape Intel’s service-related pages, generate summaries, and craft a personalized sales outreach email to initiate the conversation. Let’s see it in action.

Conclusion

Manually crafting sales outreach emails can be time-consuming and inefficient, but with an AI-driven agentic workflow, the process becomes seamless and scalable. By leveraging LangGraph, we can automate web scraping, generate service summaries, and create highly personalized outreach emails, significantly improving engagement and conversion rates.

This approach not only saves time but also ensures that outreach efforts are more relevant and data-driven. Whether you’re targeting startups or enterprise-level companies, automating sales outreach with AI can give your business a competitive edge. Try implementing this workflow and experience the power of AI in sales!
