218 | Zero to Leads: How to Build a Personalized LinkedIn Outreach Machine for Your ICP with Anna Bui
40m 4s
In this live session, automation coach Anna Bui walks us through the exact step-by-step process she used to scrape leads from LinkedIn using Apify, enrich them with data, and send them through a highly targeted outreach workflow, all while keeping it lean and cost-effective for early-stage experiments. This isn't theory. It's real-world, been-through-the-struggles, here's-what-actually-worked automation. Anna doesn't just build tools, she builds systems that learn and adapt to your personality. You'll see how she overcame common roadblocks (and not-so-common AI frustrations) using n8n, Airtable, Clay, and Claude, plus how to personalize your ent...
Transcription
8210 Words, 43528 Characters
Hello, and welcome to another live episode of the Leveraging AI podcast, the podcast that shares practical, ethical ways to leverage AI to improve efficiency, grow your business, and advance your career. This is Isar Meitis, your host, and I'm really excited today for two different reasons. Reason number one, we haven't done one of these lives in a very long time. Well, not very long, I guess — about a month and a half since the last one — and we used to do this every single week, but I was on vacation in Israel, which is a very different time zone, and it would have made the whole thing a lot more complicated. But we're back, and we're going to be doing this every single week unless I'm traveling. So if you are with us live, first of all, thank you for being here. If you're not with us live, and you're listening to this as the podcast or watching this on YouTube afterwards, we're going to be back to doing this every single Thursday at noon Eastern time, with an amazing expert who's going to teach you how to do something very practical and very effective with AI and other tools. So if that's what you want to learn, come join us, because A, you get to hang out with the cool people, and B, you'll be able to ask questions, which you cannot do if you're just listening afterwards. Which leads me to the second reason I'm excited: our topic today. You've heard me say on this podcast many times before that you can have a business without marketing, without HR, without finance, without operations. You cannot have a business without clients, because without clients, you don't have a business. That's the whole point. And so, to get clients, you need leads, and getting leads is not necessarily easy. In today's episode, we're going to dive deep into how to use AI tools, together with n8n, which is my favorite automation tool, to get leads: find relevant people on LinkedIn, enrich the information about them so you know who they are, and create relevant, personalized outreach messages that drive engagement, which leads to a much higher conversion rate, which leads to clients, which is what you need in order to run your business. So if you are in business, this should be of high interest to you, because, as I mentioned, leads are the lifeblood of every single business. And our guest today, Anna Bui, has built a really cool, amazing automation that does each and every one of the steps I just described, basically getting you from "I don't have anybody" to "I have a conversation going with a long list of people." In addition to that, Anna is a Clay.com expert, and she has a course where she teaches people how to use Clay. For those of you who don't know Clay, it's an incredible tool that does a similar process: it allows you to grab people from different sources, enrich the information about them, and then reach out to them. So if you're asking yourself why Anna needs a separate process other than Clay — well, I will let her explain in more detail, but the main reason is that Clay is expensive, and unless you get a very good ROI from your clients, there might be a different way. It's also a good entry point. The process she developed allows you to do this without risking more or less anything.
It's practically free to use n8n, and with a few tokens running through the APIs, you can run the process, figure out how effectively it's working for you, finesse it, and then go to Anna and take the course on how to implement it in Clay. So with all of that in mind, I am really, really excited to welcome Anna to Leveraging AI. Anna, welcome to the show. In the next few years, AI technology will change our world dramatically. Whether you are a business executive trying to catapult your business forward, or just somebody who refuses to be left behind and wants to advance your career, this is the show for you. I'm your host, Isar Meitis, a serial entrepreneur and an AI enthusiast. You'll hear invaluable practical tips from innovative business leaders, AI practitioners, and some of the world's leading AI companies. You'll also hear from some of the brightest AI minds in our world today on how you can leverage AI in ethical ways to advance your career and grow your business. Thank you so much, Isar. I'm so glad everybody is joining our LinkedIn Live right now, where I'm going to show you the workflow and dive deep into not just the technical side, but also all the problems and all the headaches I got from building this flow. Just a little bit about myself: I was working as a project manager, where I dove into automation just to make the team's life easier at my previous company. I got into Zapier and Make, which are very popular, and then I made the shift to n8n and became, literally, an n8n enthusiast. I recently became an official n8n creator, so you'll probably see my verified profile later on. I also became a coach at Clay Bootcamp, where I teach people the skills to use Clay as a tool. And now we're doing automation, especially with n8n, because it's my expertise as well. I'm really looking forward to our session, and later on, if anyone has any questions about Clay or automation, I'd love to answer them. Awesome. And to the people who are joining us live on LinkedIn or on Zoom, feel free to ask questions. I'm monitoring the chat in both places, so if you have any questions, just pop them in the chat, I'll bring them to Anna's attention, and we'll answer all of them. Thank you, obviously, so much for being here — I'm sure you have other stuff to do on Thursday morning, afternoon, or night, depending on where you are in the world. With that in mind, it would be great if you introduce yourself in the chat: say where you're from and what you're looking to get out of the session, so people can meet each other and network, and share links to your LinkedIn if you're on Zoom. But Anna, let's get started. Let's dive right in. Yeah, I'm just going to share my screen quickly. Please do. Okay, perfect. So this is my workflow. First of all, you can grab this workflow from the template link right here. Right now there are three templates, but I'm trying to build more and more. This is a template you can just go in and grab for free. It is this exact workflow; I just added dummy data to it. But again, what this workflow does — okay, I have to scale back a bit. When I joined Clay Bootcamp, Nathan Lippy, who is our CEO, encouraged me to post more on LinkedIn.
I had never touched LinkedIn before — I had a profile, but I never touched it and never posted anything. Then I decided, okay, I'll start posting the things that I build: n8n workflows, my opinions on the platform itself, different automations. I was just using the platform to showcase my knowledge and what I've built — no agenda, just "here's what I did." And the impact from those posts, and the encouragement from everybody on my LinkedIn, was definitely overwhelming. Hence, I built this workflow, because I had — not a viral post, but a quite popular post. And I now consider LinkedIn a warm-lead channel. If you're in sales, you'll probably agree with me: that is your nurture sequence. That's where you nurture people, where you interact with people, where prospects find you — it's basically where people have already heard about you when they come to you, and a lot of inbound comes in. So I got this somewhat popular post — I'm not going to call it popular, but it's this one — with 79 people reacting to it. And I thought, okay, I have all these people interested in my post. Let's just consider them leads, because they are indeed people who interacted with my post, so maybe some of them would enjoy my content, or some might be my ICP. Again, I'm not a business owner, so for this I'm putting myself in the position of an agency owner or a small business owner, and my ICP is people who run small businesses or small agencies, or solo entrepreneurs, who need help with automation. From that, I figured maybe 50-something percent of the people who interacted with my post are probably my ICP. So how can I actually capture them, nurture them from there, and eventually turn them into prospects and close deals? That was my initial thought with this workflow. Any questions, Isar? No — two cents. First of all, on a very high level, what are n8n and Make, the tools that you mentioned? Automation tools have been around for a very long time. I used Zapier for the first time around 2015, so about a decade ago. What automation tools know how to do is move data from one software to another. You have a contact on LinkedIn; you can grab the information and put it in your CRM. You can take the data from your CRM, put it in an Excel file, then in your outbox — these kinds of things. What these tools had no ability to do is reason, think, research — all the things that AI is good at. And now the combination of these traditional automation tools with AI capabilities gives them superpowers, if you know how to use them. Two cents about Make versus Zapier versus n8n: Make is probably the easiest to use. Zapier is a step above that with more capabilities, but mostly a lot more connectors — though to be fair, at this point it doesn't really matter, because you can connect almost anything one way or another. And n8n is the geeky brother of these two tools. On one hand, the learning curve is steeper.
It's more complicated to use, and that's why it probably shouldn't be your first entry point. But because of the way it's built, it's significantly more flexible. You can do a lot more with it — you're practically unlimited in what you can do, because it can run code and it can receive any kind of webhook. The other reason it's becoming very, very popular is that it's open source, and there's a huge community developing additional capabilities for it. And you can self-host it, meaning you can grab the code and host it on your own server, so your data doesn't go to Make or to Zapier; it stays in the box you're hosting. That also makes it a lot cheaper: you're paying, say, $6 a month for hosting, no matter how many automations you actually run. So there are many reasons to use n8n. That being said, I wouldn't start with n8n just because it's more complicated — unless, Anna, you disagree. I agree with you. I started with Zapier, I started with Make — and thank you so much for spelling all that out; I was about to jump straight in. The reason I avoid Zapier is that it tends to get more expensive. Yes, it has a lot more connectors, and it will make your first step into automation a lot easier because it's very straightforward — just step-by-step, like a waterfall. It can do a lot, but it gets very expensive the more workflows you run. That's why, same as you, for all the reasons you just listed, n8n is my favorite automation platform: you can self-host it, or you can use their cloud. And one thing you'll probably see more and more of is MCP servers — you can host an MCP server in the cloud or on your own local hosting. So it's definitely a very powerful platform. One thing people keep reaching out to ask me — sorry, before we jump into the flow — is how they can start with n8n. I think the best way is to just grab one of the templates you see in the community; they have more than 4,800 templates. Search for your use case, even for your tech stack, grab a template, and play with it. The more you play with it, the more you'll find it's very straightforward. The n8n community is very powerful as well — everybody contributes, you'll see a lot of people posting about their workflows and their problems, and everyone I've ever met who enjoys building in n8n is very kind. So please feel free to reach out to them and ask a bunch of questions. But I highly recommend you jump straight in and start playing with a template. It doesn't have to be perfect — they're templates; you're supposed to modify them to suit your needs. The more you build workflows that solve your own problems, the faster you learn. 100%. I'll add one more thing, and then we'll dive into the actual flow. When you grab these templates, you don't have to use the entire template. You can treat a template as building blocks and say, oh, this segment — these four steps — does this one thing as part of the twenty steps the template's automation does.
And I can use those four steps somewhere else, because I need that one thing for a different purpose. This way you can mix and match components from different templates, knowing very, very little, and still build something that works, because you're taking Legos somebody else already built and combining them into a complete solution. But now let's dive into our actual workflow — we've definitely given it enough of an intro. So the workflow splits into three sections. We have the trigger right here, then the whole process, as you can see, starting from here, and then the end result: a record created in your Airtable, or whatever database you use for storage. I use Airtable because it's very straightforward and easy for beginners. Then there are the two Apify actors. The first Apify actor scrapes the LinkedIn post reactions for you — it's basically a little special key that unlocks the door of LinkedIn: okay, these are the post reactions, let's get everything. A little trick, unfortunately, because we all know how particular LinkedIn is about automation and scraping. So there are two actors I'm using. The first question is: I have my post with 79 reactions — how do I get the profiles of those 79 people? And here is the one I use — oh, sorry, it should be this one, my apologies. This one is the LinkedIn Post Reactions Scraper. This is your Apify console, and there are just two things you need to look at. Just a second — to those of you who don't know, Apify is a platform where you can get many different tools, all API-based, that you can use in the automations and applications you're building. They're geared to do very, very specific things. You go into Apify and you can find many of them; this particular one knows how to scrape LinkedIn post reactions. Yeah, thank you so much. Apify has a lot of different API endpoints — you can even find Google Maps vendors or Instagram followers, all those sorts of things — with actors serving specific needs. But for this one, we're focusing on LinkedIn, and first of all on the LinkedIn Post Reactions Scraper, which scrapes the reactions of a particular LinkedIn post. The information you need to look at is, first of all, the input parameters — we need to send these parameters to the endpoint. Go to the input tab and you'll see them. And the post URL you can find here: when you go to the post's link, this particular number at the end is the unique number for this LinkedIn post. You add it there, and Apify will start extracting the reactions from that post. So again, for those of you who can't see the screen because you're listening to the podcast while driving: it's the end of the URL of your post — a long numeric string. You grab that; that's what Apify needs in order to know which post to scrape.
Yeah, perfect. So, in n8n you have a lot of native nodes, and the native nodes I'm using here include the HTTP Request node and a Wait node. The first one we use is the HTTP Request node. This is the endpoint — the URL to call this Apify actor. You can see right here, it's the link to the endpoint. After that, you add your actual token: when you open an Apify account, you get your unique token, and you add it at the end. That means the call will use credits from your account — as you can see, I noted the cost here — so you're spending your account's credits to run this actor. Does that make sense? Yeah. And then from here, as you can see, we send the post URL, and the page number is one. Going back to this actor, the information you add, as you can see here, includes the page number. Page one returns reactions one through 100. So if your post has under 100 reactions, you just need page one, which is the default. But if you have a very viral post, with 500 people interacting with it, then you'd also run pages two, three, four, and so on. To keep it simple, it's one page here, and it extracts all 79 reactions from this LinkedIn post, as you can see.
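For readers who want to see the shape of that call, here is a minimal sketch of what the HTTP Request node sends, assuming Apify's synchronous "run and get dataset items" endpoint. The actor ID and the input field names (post_url, page_number) are placeholders, not the exact actor used in the episode — check the actor's input schema in the Apify console for the real names:

```typescript
// Minimal sketch of the n8n HTTP Request step that calls the Apify actor.
// Assumes Node 18+ (global fetch). The actor ID below is hypothetical.
const APIFY_TOKEN = process.env.APIFY_TOKEN!; // your personal Apify API token
const REACTIONS_ACTOR = "some-user~linkedin-post-reactions-scraper"; // placeholder

async function scrapeReactions(postUrl: string, pageNumber = 1) {
  // "run-sync-get-dataset-items" starts the actor, waits for it to finish,
  // and returns the scraped items in one call -- one item per reaction.
  const res = await fetch(
    `https://api.apify.com/v2/acts/${REACTIONS_ACTOR}/run-sync-get-dataset-items?token=${APIFY_TOKEN}`,
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      // One page covers up to 100 reactions, so a 79-reaction post needs
      // only page 1; a viral post would need pages 2, 3, and so on.
      body: JSON.stringify({ post_url: postUrl, page_number: pageNumber }),
    },
  );
  if (!res.ok) throw new Error(`Apify call failed: ${res.status}`);
  return (await res.json()) as Array<Record<string, unknown>>;
}
```

In n8n this whole function collapses into a single HTTP Request node; the sketch just makes the moving parts — endpoint, token, and input JSON — visible.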
There's an interesting question on LinkedIn: do these tools comply with the LinkedIn user agreement — with what LinkedIn allows you to do, from their perspective? I'm not 100% sure about that. What I do know is that whenever you go to Apify for a LinkedIn-specific scraping actor, try to use one that requires no cookie. If it uses no cookie, that means the action I'm doing right now — scraping the reactions from my LinkedIn post — comes from the person who built the actor, not from me, because I didn't associate my LinkedIn account with the scraping process. It's just an API call to something another person built; my LinkedIn profile isn't attached to it. There's no cookie associated, nothing to track my behavior. Does that make sense? What I will say is that when it comes to LinkedIn automation, there are two aspects. One is what you can get off LinkedIn, and there it is very hard for LinkedIn to know who is actually doing it. As Anna just said, there's no way for them to know who is scraping; they can try to block it, but they can't know it's you, because it's coming from a specific application. The other aspect of LinkedIn automation is having it post or engage on your behalf on a regular basis — and this is where you might get in trouble, because then it's engaging as you. The usual best practice with these tools is to stay within whatever levels they tell you they've tested and consider okay. It's still against LinkedIn's rules and regulations, but each of these tools will tell you: if you do no more than 30 a day, you'll probably be fine. And then you can decide your level of risk — how much you want to push that envelope. I was recently on a call with Chris Chaga — I probably butcher his name — who is an expert at Clay Bootcamp when it comes to LinkedIn outreach, and he told me from the very beginning: there will always be a risk with LinkedIn outreach. It's not something LinkedIn promotes or wants people to do — automation, interaction on your behalf, all those sorts of things. So if you decide to run a LinkedIn outreach campaign, that's at your own risk, and you have to accept that. For this workflow, I'll go back to what Isar just mentioned — it's an application from another person, et cetera — so I guess I'm staying hidden, I hope. Yeah. Okay, so the first step: we used Apify to scrape the people who actually engaged with the prompt — sorry, engaged with the post. What happens then? After that, we have 77 items coming in, and it's like, oh my God, it's too much, too overwhelming. That's why we have this Loop Over Items node — another native node in n8n — which basically splits things into batches. It breaks the 77 items down to process one item at a time: one item runs through the whole process, and when it's done, the second comes in, then the third, then the fourth. It's basically there to avoid flooding the system, and to give each item time to actually process and reach the desired outcome. So here's the first item coming in. I always recommend people try the Set node — another native node in n8n — which is basically there to clean your data. The more I work with this data and with scraping, the more I realize you have to start cleaning your data from the very beginning, so it doesn't give you a big headache at the very end. The same applies when you start building your Clay table: always make sure your data is clean. Anyone who enjoys data analysis or data strategy will probably agree — it's hard to clean data. Let's explain in simple English what cleaning the data means in this particular use case. We scraped information from LinkedIn, and we're now sending the first item through, because we're doing this one by one. The information that comes in is the profile: first name, last name, company — everything it could pick up. What does the data cleaner do? There's a lot of data here — the profile picture and everything like that — and we don't need all of it; you only need what's important. This step basically cleans it out and gives you just what you want. For the profile URL, I have to explain a little. We have the LinkedIn URL, which is the link to your own LinkedIn profile, and then you have the URN, which is a unique number associated with your LinkedIn account. The URL can change — you see, mine is anna-bui-something-something, and maybe I change it to a different name — but the URN will never change. So the URN is the primary data point if you want to do any sort of lookup, because it's reliable. Okay, going back: I just want the job title, the LinkedIn URL, the URN, and the name. That's all the next iteration of scraping needs — a very clean, straightforward data set that isn't going to mess up the system or the scraping sequence, because now it's: I'm not going to give you all this crazy stuff, just this clean, straightforward information right here.
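As a rough illustration of that Set node, here is the "keep only what you need" step written as a plain function. The raw field names (reactor, headline, profile_url, urn) are assumptions — every scraper labels its output differently, so map from whatever your actor actually returns:

```typescript
// Sketch of the "clean data" Set node: reduce each raw reaction item to the
// four fields the rest of the workflow relies on.
interface CleanLead {
  fullName: string;
  jobTitle: string;
  profileUrl: string; // the human-readable LinkedIn URL; the owner can change it
  urn: string;        // the stable unique identifier; safe to use for lookups
}

function cleanReaction(raw: any): CleanLead {
  return {
    fullName: raw?.reactor?.name ?? "",
    jobTitle: raw?.reactor?.headline ?? "",
    profileUrl: raw?.reactor?.profile_url ?? "",
    urn: raw?.reactor?.urn ?? "",
  };
}
```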
So it does two things. A, it filters the lot of data you got down to just the little data you need. And B, it picks which particular pieces of data you carry forward. Like you said, the URL may not be the right identifier — the URN is — so that's what you transfer forward as the person's ID. This is the first step that sets the stage for everything else: these are the attributes I'm actually going to use moving forward in my automation. Awesome. Perfect. Great explanation. Next step. Yeah — I just want to add one more thing. The reason I clean the data from the very beginning is that all the other nodes can use it as a reference, so it becomes a very reliable data set. After that, I check whether there are any duplicates in my table. As you can see, right now the table is blank, so the check — is there existing data for this person? — comes back saying there's nothing there, and the flow starts creating the record. If the person already exists, it does nothing. To summarize, this is a check node: it simply checks whether this person is already in my database, and if not, we continue to the next step. Yeah, makes sense. So now it has confirmed this person doesn't exist yet, which means it's going to create a new record, mapping into all these fields in Airtable. Right now it's empty, but when it runs, you'll actually see the information coming in. This is the Airtable node that's integrated into n8n — very straightforward, you just map things out. As you can see with the URN, I just reference the cleaned node. That's why you clean the data. So again, for those of you who aren't watching and don't know Airtable: Airtable is like Excel with a better user interface — a database where you define rows and columns and put data in them. To connect the two: we now have four pieces of information on each individual we scraped. You connect the Airtable node, which brings into n8n the four column headers that already exist in Airtable — you could do the same thing with Excel — and then all you have to do is drag the parameters you brought from the scraper into each of the columns: the name to the name, the ID to the ID, the link to the link, whatever it is, and that's it. That's how you set it up.
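A minimal sketch of that check-then-create pair, written against Airtable's REST API (which is what the n8n Airtable node wraps). The base ID, table name, and column names are placeholders for your own setup:

```typescript
// Sketch of the duplicate check plus "create record" pair in Airtable.
const AIRTABLE_TOKEN = process.env.AIRTABLE_TOKEN!;
const BASE_ID = "appXXXXXXXXXXXXXX"; // placeholder base ID
const TABLE = "Leads";               // placeholder table name

async function createIfNew(lead: {
  fullName: string; jobTitle: string; profileUrl: string; urn: string;
}) {
  // Look the person up by URN (the stable ID), never by the changeable URL.
  const formula = encodeURIComponent(`{URN} = "${lead.urn}"`);
  const lookup = await fetch(
    `https://api.airtable.com/v0/${BASE_ID}/${TABLE}?filterByFormula=${formula}`,
    { headers: { Authorization: `Bearer ${AIRTABLE_TOKEN}` } },
  );
  const { records } = (await lookup.json()) as { records: unknown[] };
  if (records.length > 0) return; // already scraped: skip, and save LLM tokens

  await fetch(`https://api.airtable.com/v0/${BASE_ID}/${TABLE}`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${AIRTABLE_TOKEN}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      fields: {
        Name: lead.fullName,
        "Job Title": lead.jobTitle,
        "Profile URL": lead.profileUrl,
        URN: lead.urn,
      },
    }),
  });
}
```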
And now, what's going to happen is that as each data item comes in, it checks: does this data already exist? It checks based on the URN, which we now know is the person's unique identifier. If it doesn't exist, it creates a new line in the database — in this case, a new row in the Airtable table. By the way, one more thing. You might say, okay, why do I even need to check? Why do you care about duplicates? Two reasons. A, duplicates are never a good thing when you're building a database. B, some of the next steps send the data to large language models, which means you're paying for tokens — and if you can minimize the tokens you pay for, you save yourself money and time. Oh yeah, definitely. Been there, done that: I scraped one person multiple times because I was testing things out. I highly recommend having a lookup check on your data so you don't waste credits. Okay, now for the fun part. From the previous Apify actor, we got this person's profile ID and URL. Now we actually need to enrich that — we need to scrape this person's personal LinkedIn profile. Hence, we move to this second actor right here. There are two actors used in this workflow: the first scrapes the reactions on the post, and the second enriches the individual profile of each person who interacted with your LinkedIn post. Does that make sense? Yeah. What we scraped before was the post, so we only got the information that appears on the post. Now we go to the actual person's profile, and we can pull whatever we want: their company name, their company size, the industry they're in, what they wrote about themselves, their title, where they worked before — all the stuff that appears on a profile comes in by scraping it, which is the current step. Yeah. And for this actor, all you need is the profile URL. I take it from the clean-data node, add it to the actor, the actor runs, and it gives out all the information. The previous actor gave us only very basic information — the name, the URL, the profile link. This one gives the full name, the headline, the experience, the company they work at — it basically scrapes the person's entire LinkedIn profile. All of that comes out as one item with many pieces, so I want to combine the parts I need — the full name and so on. This Aggregate node is quite straightforward: it just merges that long list into one cleaner item.
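Here is the same kind of sketch for the second actor call — the profile-enrichment step. As before, the actor ID and the input field name are placeholders rather than the exact actor used in the workflow:

```typescript
// Sketch of the second Apify call: enrich one person by scraping their full
// LinkedIn profile. Actor ID and input field name are hypothetical.
const APIFY_TOKEN = process.env.APIFY_TOKEN!; // same token as in the earlier sketch
const PROFILE_ACTOR = "some-user~linkedin-profile-scraper"; // placeholder

async function enrichProfile(profileUrl: string) {
  const res = await fetch(
    `https://api.apify.com/v2/acts/${PROFILE_ACTOR}/run-sync-get-dataset-items?token=${APIFY_TOKEN}`,
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ profile_url: profileUrl }),
    },
  );
  if (!res.ok) throw new Error(`Apify call failed: ${res.status}`);
  // Returns headline, experience, current company, and so on -- the raw
  // material the Aggregate node then merges into one clean item.
  return (await res.json()) as Array<Record<string, unknown>>;
}
```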
And then, from that merged item, we have our AI classify whether this person is actually part of our ICP. Based on the information we scraped from this person's profile, the AI follows the prompt I gave it — as you can see here: you are an AI classifier that determines whether this person is the ICP or not — and I add in my ICP definition, the roles that need automation. Basically, I paste my ICP here, and I also give it an output format: is this an ICP, yes or no, and the reason why this person is or isn't an ICP. It just needs to produce that, based on the incoming data from the scrape. This is the prompt-engineering part, and everybody has a different ICP, so always make sure you customize it to your needs so it gives the correct output. Quick tip for testing this: if you run it here in the automation, every run costs tokens. It's very, very cheap — you could probably run this a thousand times for a dollar — but if you want to make testing completely free, use any of your existing AI tools, ideally the one whose API you're actually going to use. If you're using ChatGPT, go to ChatGPT — or Claude, or Gemini, whichever you'll use through the API — give it the information about the person, take a screenshot from LinkedIn, drop it in, and use the prompt you plan to use in this automation. Keep improving the prompt until you get the proper answer every single time, and then start running it through the automation — that way, testing costs you nothing. Yeah. In one of the sessions I had at Clay Bootcamp, Spencer Tahil — he is excellent — broke down how to actually engineer this kind of prompt. At first I thought, oh, AI prompting, easy, not a big deal. The way he broke it down helped me a lot with this project, because you don't just give the AI instructions — you give it knowledge about your ICP, plus a required format it needs to follow and an example it can see and reference. The more structured and detailed the information going into the AI, the better the result. As you can see here, I ask it for just two outputs: is this an ICP, true or false, and the reason why you think this person is or isn't an ICP. Very simple. Awesome. And then, lastly: now that I have this person's information — first of all, it couldn't find this person's email address, which can happen; the scraper is hit or miss on emails — we have the ICP verdict and the reason. So we just update the record we created at the very beginning with the reasoning and the ICP check. And that's the whole flow. Everything comes back to the loop node, which keeps iterating until it has processed all 77 items.
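Pulling those pieces together, here is a hedged sketch of the classifier step — instruction, knowledge (a stand-in ICP definition you should replace with your own), required format, and parseable output. The model name is illustrative, and n8n's LLM nodes make this same kind of chat-completion call under the hood:

```typescript
// Sketch of the ICP classifier step.
import OpenAI from "openai";

const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

// Instruction + knowledge (your ICP) + required output format, following the
// prompt structure described above.
const SYSTEM_PROMPT = [
  "You are an AI classifier that determines whether a person matches the",
  "user's ideal customer profile (ICP).",
  "ICP definition: small business owners, agency owners, and solo",
  "entrepreneurs who need help with automation.", // replace with your own ICP
  'Respond with JSON only: {"is_icp": true|false, "reason": "<one sentence>"}',
].join("\n");

async function classifyLead(profileSummary: string) {
  const res = await openai.chat.completions.create({
    model: "gpt-4o-mini", // any cheap model is fine for a yes/no classification
    messages: [
      { role: "system", content: SYSTEM_PROMPT },
      { role: "user", content: profileSummary },
    ],
    response_format: { type: "json_object" }, // force parseable output
  });
  return JSON.parse(res.choices[0].message.content ?? "{}") as {
    is_icp: boolean;
    reason: string;
  };
}
```

The two returned fields map one-to-one onto the "ICP?" and "reason" columns the workflow writes back to Airtable.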
Awesome. Okay. First of all: incredible — a really, really powerful capability. We will provide the link to this in the show notes, so anybody who wants access to it can have it. The way I found Anna is that I saw her sharing this on LinkedIn and thought, oh my God, this is awesome, I want everybody to know — and that's why she's on the show. She was gracious enough to share this with the world, and we're going to share it in the show notes of this episode as well, so you can get access to it and copy it. And like Anna said, don't just copy it — learn from it. Understand each and every one of the components: open it on your computer, watch the video while looking at the actual thing, and dive into each component so you understand what it does. That was the main reason I wanted to dive into this and show you step-by-step. I want to touch on a couple more things, and then I'll get to a few of the questions coming in from LinkedIn. Aspect number one: if you're still working on the prompt for the agent that classifies whether someone is a potential customer — whether they're aligned with your ICP — one thing you can do, and I have a very similar process myself, is add a "maybe" option. When it's not sure, it outputs a maybe. That lets the AI be okay with not being sure — and I want that, because otherwise, when it isn't certain, it may decide someone isn't a client and I'd miss them. What I do with the maybes in my table is go and inspect them manually. That gives you the opportunity not to lose people who might actually be good potential clients. Every time it's not 100% sure, I allow it to say, "I'm not 100% sure," and then I go check myself. So that's number one. The second thing I want to ask you, Anna — I know the answer, but I want you to explain it in theory, because we don't have it in front of us — is the next step. Let's say I have all these people. How can I create engaging outreach that actually captures their attention and lets me start a conversation with them? Okay, that's a great question. But first, let me show you how this workflow runs. Right now the table is blank, right? Yeah. After I run the whole workflow, as you can see here — and for those of you not watching, you can see the thing running; it shows you which step it's on right now — we have the first line populated in Airtable. It's going to keep doing this through all 76 items, and for each one we'll initially see the basic information from the first step, and then the enrichment and the ICP check, because that's how the automation runs. Now, to answer your question: the way to build the outreach is — how to say this — this is where Clay comes into play. You have all these people's profile URLs, right? And all this information. With Clay — you can use other tools as well, but I'd have to dig into those a bit more — you have Claygent, which is like an AI agent inside Clay. All the information you scraped about this person goes in, and Claygent can enrich the person's data further based on the profile URL. Then you use Claygent to actually write the outreach for you, because it can base it on all of this person's personalized parameters. For example, if this person's title is AI and automation developer, maybe it writes something like, "Hey, I saw that you're also an AI and automation developer," and goes from there.
And that is where Clay comes into play. The reason I recommend Clay for this is that it can work with so many different variables — specific, personalized variables about this person — and create the outreach from them. One thing I recommend: have a draft outreach you'd like the AI to follow. Make sure it uses the correct name; give it an example email that you consider good enough for outreach. You let the AI see that example, and from there it weaves in the different variables that Clay was able to enrich from this person's information — the profile URL — to write the outreach inside Clay itself. From there, Clay can send the outreach back to n8n, or out to a sending tool like Lemlist or Smartlead. One more thing: if you want to combine n8n and Clay, let n8n do the cheap work to save your Clay credits. For example, as mentioned, it costs you two or three dollars to scrape 1,000 people and get their profile URLs. Export that as a CSV, upload it to Clay as a new table, use Clay's enrichment functions to extract even more personal information from those profile URLs, and write the emails from there. Then, once you see the outreach for each person inside Clay, shoot it out to wherever you send from. So you have to be strategic about how you make your outreach highly personalized. And with Clay, the more narrow and filtered your list, the better: the list you put into Clay should be 100% people you want to target — the list you absolutely want to nail. Then Clay will excel at the outreach and write very personalized emails. Okay, awesome. I'll say one more thing: I think you can still do it here, because all you have to do is build a second AI agent in your n8n process that does exactly the same thing. You can give it all the information from the previous steps — here's the name, here's the title, here's what they wrote about themselves in the profile, here's the size of the company, all the stuff you have — plus a template of the email or message you want it to base things on, and then tell it: personalize this using all the information we collected in the previous steps. It will do the same thing, and you can put the output in Airtable — a column holding the recommended message to send. I agree that it's potentially easier to do in Clay, and Clay can then connect to a gazillion other places to continue the process. I'm not taking anything away from Clay — I think it's an awesome tool — and like I said, it's not a bad idea to combine the two and get the best of both worlds.
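Here is a hedged sketch of that second-agent idea — drafting the message inside n8n with the fields already collected, then storing it in an extra Airtable column. The example template and field names are illustrative, not a prescribed format:

```typescript
// Sketch of the "second agent": draft a personalized outreach message from
// the scraped fields, to be stored in a "Draft Outreach" Airtable column.
import OpenAI from "openai";

const openai = new OpenAI();

async function draftOutreach(lead: {
  fullName: string;
  jobTitle: string;
  profileSummary: string; // the enriched, aggregated profile text
}) {
  const res = await openai.chat.completions.create({
    model: "gpt-4o-mini",
    messages: [
      {
        role: "system",
        content:
          "You write short, personalized LinkedIn outreach messages. " +
          "Match the tone and length of this example, use the person's real " +
          "name, and reference one specific detail from their profile:\n" +
          '"Hey {name}, saw you\'re also deep into AI automation. Curious ' +
          'how you\'re handling lead gen today?"',
      },
      {
        role: "user",
        content:
          `Name: ${lead.fullName}\nTitle: ${lead.jobTitle}\n` +
          `Profile: ${lead.profileSummary}`,
      },
    ],
  });
  return res.choices[0].message.content ?? "";
}
```

Writing the draft to a column rather than sending it directly keeps a human review step in the loop, which fits the risk discussion earlier in the episode.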
A lot of people are saying thank you, that this is great, that this has been awesome, and that they really like the flow. And I agree: first of all, it's an awesome flow, but you also did a very good job of explaining how it works. If people want to learn more about you, learn from you, work with you, take your courses, or hire you to build automations for them, what are the best ways to do that? Please feel free to reach out to me on LinkedIn. From there, you'll see my other resources, and I'll keep posting things constantly. And maybe on a next episode, hopefully, I can show you how to actually build out the AI agent that writes the email for you. I was just thinking about scalability — with anything, you're fine at first, but when you start scaling, you probably want something more robust. But yeah, thank you so much for having me. I'm so glad everybody joined, and I hope you enjoyed the session and found it helpful. Thank you so much — this was absolutely fantastic. Thanks, everybody, for joining us, and thanks for being very active; there was a lot of chat happening on LinkedIn and on Zoom, and people asked great questions. I didn't bring all of them up — I answered some myself when I knew the answers — but thank you all for joining. Again, I know you have other things to do. Come join us next week — every Thursday, 1 p.m. Eastern — for something like this with another amazing expert like Anna. And thank you, obviously, Anna. This was amazing. First of all, thank you for working on this, and thank you even more for coming and sharing it with us. Thank you so much. Thank you.
Key Points:
The Leveraging AI podcast focuses on practical, ethical ways to use AI for business growth and career advancement.
The episode discusses the importance of clients for a business and the need for leads to acquire clients.
Anna Bui, the guest, introduces an automation process using AI tools and n8n to generate leads effectively.
Summary:
The Leveraging AI podcast, hosted by Isar Meitis, emphasizes practical and ethical utilization of AI for business and career enhancement. In a recent episode, the significance of clients for a business is highlighted, stressing the necessity of leads to acquire clients. Anna Bui, the guest expert, presents an automation process that combines AI tools with n8n to efficiently generate leads. The discussion delves into the importance of nurturing leads from platforms like LinkedIn and creating personalized outreach messages to drive engagement and conversions. Anna's approach involves using n8n as an initial tool before progressing to more advanced platforms like Clay, due to its cost-effectiveness and flexibility. The episode also touches on Apify's scraping capabilities for LinkedIn data and emphasizes the value of community support and template usage in getting started with automation tools like n8n.