Introduction to Python For SEO.
Articles Mentioned:
- 17 Pandas Functions to Replace Excel with Python (and be happy forever): https://www.jcchouinard.com/pandas-excel/
- Google Search Console API: https://www.jcchouinard.com/google-search-console-api/
- Post to Facebook Groups: https://www.jcchouinard.com/post-to-groups-using-facebook-graph-api-python/
- How to use Reddit API with Python: https://www.jcchouinard.com/how-to-use-reddit-api-with-python/
- Web Scraping: https://www.jcchouinard.com/web-scraping-with-python-and-requests-html/
- Get Started With Selenium: https://www.jcchouinard.com/learn-selenium-python-seo-automation/
- Create XML Sitemaps: https://www.jcchouinard.com/create-xml-sitemap-with-python/
- Pagespeed API: https://www.jcchouinard.com/pagespeed-api-and-lighthouse-forecasting/
- BERT Score: https://www.jcchouinard.com/get-bert-score-for-seo-by-pierre-rouarch/
- Deploy Flask App to Heroku: https://www.jcchouinard.com/deploy-a-flask-app-on-heroku/
- Flask Mega Tutorial: https://blog.miguelgrinberg.com/post/the-flask-mega-tutorial-part-i-hello-world
- Python Script Automation (MAC): https://www.jcchouinard.com/python-automation-with-cron-on-mac/
- Python Script Automation (Windows): https://www.jcchouinard.com/python-automation-using-task-scheduler/
- Spyder IDE: https://www.jcchouinard.com/python-with-spyder-ide/
- Jupyter Notebooks: https://www.jcchouinard.com/how-to-use-jupyter-notebook/
- VSCode: https://www.jcchouinard.com/python-with-vscode/
- Install Python: https://www.jcchouinard.com/install-python-with-anaconda-on-windows/
Video Transcript
0:00
Thanks. So I'm very excited to be here. Like ever since I've moved to Australia, I wanted to come to this Melbourne SEO meetup
0:09
And here I am right now. So it's fantastic. And I'm also very excited because Python is great
0:15
Frankly, I'm starting to like it as much as SEO. It is crazy how great learning a new programming language is
0:26
And that's what I want to share. So let's get some stuff done here
0:32
This is not an SEO automation or machine learning talk. I'm not qualified to talk about machine learning
0:41
I don't have the maths necessary. And SEO automation is a long process
0:46
And what I want to stress is that you can see a massive amount of benefit from Python
0:53
well before you get into SEO automation and machine learning. I feel like these are trending keywords that people use without
1:02
necessarily knowing how to do it or even what it is. But I do know that you will see benefits
1:12
within like three, four, five weeks of learning Python. You're going to be able to make your work
1:18
easier as an SEO. Not everyone will benefit from the same thing though, because you have small
1:24
website owners that just have their own blogs or their own website. You have agencies and you have
1:30
big sites. So everyone can benefit from it, but you will not learn the same thing. So basically
1:38
a small site owner will learn marketing automation, like social media automation. They will learn some web development. They will start to build their free SEO tools because,
1:51
let's face it, small website owners don't want to pay for SEO tools, and they should not
1:58
Python can leverage free tools because you can build them yourself. Agencies, on their part,
2:06
what they're going to be doing is mostly reporting. There will be some kind of automation, and they
2:13
also can create scripts that they can reuse from client to client. So this is amazing
2:20
When you get into larger websites like Seek or Jora or whatever big website you are at,
2:29
you will get into data analysis. You will start manipulating data. You can build your own tools tailored to what you need,
2:38
not like the tools that were built for everyone, but those you can build specific to your need
2:44
And you will start to leverage APIs. So there is not one single path to learning Python
2:51
In this presentation, I try to focus more on the small site owners because I've written a lot about that
3:01
And I think most people can apply this to their own blog
3:06
So this is why it's interesting. So what we're going to talk about is five reasons to learn Python
3:13
And I'm going to give you a few example scripts, a few that I just posted about for this presentation on my blog
3:21
And I also tested some SEO hypotheses with Python. And I'm going to talk to you about that
3:27
And I will in the end show you how you can get started right now
3:31
So it's a very big presentation. Feel free to ask questions. I don't mind
3:37
I speak fast or slow, depending on the people listening to me
3:44
But basically, feel free to stop me. We're just going to make it longer
3:49
And if we don't have time to cover everything, I put links to everything
3:54
So the goal here is, yes, I give you the most resources I can
3:59
It's not on GitHub yet because I posted most of it a few weeks ago or in the last few days. I
4:07
posted one article yesterday, just for this. So let's get going. Why should we learn
4:17
Python for SEO? So let's cover the first reason to learn Python for SEO. Let's talk about Excel
4:24
glories. How much do we like Excel? So this is one. Oh, man, I hate that. I just look at that
4:32
I can't read it. I hate it. I don't want to read it. Even if I would read it, let's say you put
4:37
one error in that. How do you find it? Like, it's disgusting. I hate it. It's really prone to error
4:44
Formulas are a thousand feet long. We hate that. This is a really bad process to work with. So leave
4:54
that to, I don't know, accountants or whatever, but stop using Excel, please. And if you look at
5:03
the actual report, it's quite amazing, but how can you trust this when you have so many chances
5:09
of getting errors. And this is one example that I just found is that some people have tried to look
5:16
how many spreadsheets actually have errors. And they said that it's 90% of all spreadsheets. So
5:23
even if we're all good, there is a big chance that a few of your spreadsheets that you've built
5:30
actually have errors and you can't trust them. So yeah, this is quite a good reason. Another Excel
5:38
glory. So I'm not sure if you've heard of that little thing. It's called COVID. So a few people
5:44
are talking about it right now. It's so popular that governments and all kinds of
5:50
institutions have decided to start tracking it and they've started adding data to it. And some
5:57
people in UK decided that it would be a good idea to use Excel as their database for their COVID
6:05
tracking. So what happened is that the data was not good. That is a big surprise. So it
6:13
made me laugh quite a bit. So let's have some fun. It's not against people, but that's funny
6:22
Another pitfall of Excel. Let's say you have a lot of data. You've made a massive good work
6:29
And this is the reason why I started learning Python. You do that. You extract all the URLs
6:35
on your site, you match them to Google Search Console and to Google Analytics, and after
6:39
months of work, you want to show that to your colleagues, and what happened
6:45
You cannot load the file. It sucks. Too bad. Excel is bad
6:50
But let's forget about it. We're going to give you the first rows
6:54
It should be fine, right? So another pitfall, and Excel is all about pitfalls
7:01
They made an entire article about it saying, we have some limits
7:05
Look how long this article is. It's just limits over limits and over limits
7:11
So this is why I hate Excel. And this is why I want to show you what you can do instead of Excel
7:19
So let's say you want to build your own ads, right? You want to go into Google Ads and you want to put all your keywords in a list. So we all did that at some point. We put some "used" and "new",
7:36
And we put the kind of product we're selling, and we're selling if it's for sale and for rent
7:41
And then we do some concatenation of all of this, and you get your list, right
7:48
You could do that in four lines in Python. Elias Dabbas has built an amazing library that does so many things for SEO
7:57
So this should be one of your starting points in SEO. Let's see, four lines and you can get your list of keywords really quick
8:07
You just put whatever you're selling and you can do all these match up and it's going to do your keywords list for Google Ads
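The concatenation step described above can be sketched in plain Python with `itertools.product` (a library like the one mentioned wraps this idea, and much more; the word lists below are illustrative placeholders):

```python
from itertools import product

# Combine product attributes into ad keywords, as the slide describes.
conditions = ["used", "new"]
products = ["honda civic", "toyota corolla"]
intents = ["for sale", "for rent"]

# One keyword per combination of condition x product x intent.
keywords = [" ".join(combo) for combo in product(conditions, products, intents)]
print(len(keywords))   # 2 * 2 * 2 = 8 combinations
print(keywords[0])     # "used honda civic for sale"
```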
8:16
And let's look at how to kill Excel in two lines. So basically, what you do is you import pandas,
8:25
and then you read your Excel file, and you never open it from Excel ever again.
8:31
So you can read it and you can export it into Excel. And for people who really want to destroy
8:39
Excel for the entire rest of their lifetime, all of these are the functions that you need
8:46
in pandas to be able to replicate most of what you do in Excel. So you read the Excel file,
8:53
you export to Excel if you want to share it with colleagues that are still stuck in this
8:59
marvelous tool, and then you can do a lot of that stuff, like getting unique counts. So just
9:06
go ahead to the website and get all of these functions. This is the absolute basics,
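A minimal sketch of the moves just described, assuming pandas is installed (the filenames in the comments are placeholders, and reading/writing `.xlsx` also needs the `openpyxl` engine):

```python
import pandas as pd

# The two lines from the slide -- read a workbook once, then never open Excel again:
# df = pd.read_excel("keywords.xlsx")
# df.to_excel("keywords_out.xlsx", index=False)

# A couple of the pandas functions that replace everyday Excel work,
# shown on an inline example frame:
df = pd.DataFrame({
    "query": ["python seo", "pandas excel", "python seo", "flask heroku"],
    "clicks": [120, 45, 80, 10],
})

print(df["query"].nunique())                 # unique count -> 3
print(df.groupby("query")["clicks"].sum())   # a pivot-table style aggregation
```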
9:12
but get into pandas and you're going to love it. Cool. Any questions so far?
9:21
I'm going fast, I think. I'm sure that there's going to be questions
9:28
I don't see any in the chat at the moment, but I might just ask one right now
9:33
You're using Excel a lot to exemplify all the issues, whether with formulas and file size and limitations
9:43
but the same kind of thing also exists with Google Sheets. This can also be applied, I would imagine, with any Google Sheets
9:53
as long as you change the – you can download the file to a CSV
9:59
or XLSX file as well, I would imagine. Yes, you can do all of this
10:06
Working with Sheets is a little bit more complicated. And Sheets is actually less bad than Excel because it can plug into BigQuery
10:15
You can get as many rows as you want. Dave Sottimano has worked with Apps Script and
10:20
shown that you can do amazing stuff in Google Sheets. However, I still prefer Python, and it is better forever
10:28
So let's move on. So yes, you can do all what you said with Google Sheets
10:35
but you still have limitations in Google Sheets. It is a free tool, whereas with a programming language
10:40
you can build your own stuff. And this is just pandas. I just want to tell you pandas is one library out of millions of libraries
10:49
Amazing. Just thinking for my Mac users out there. So, yes, Google Sheets is still good, but, yeah, Python is better
11:02
in my point of view. So reason number two to love Python and use it for SEO is APIs
11:12
APIs are amazing because they are big blocks of code that people have worked on,
11:17
companies have worked on, and they decided to make that open source and available to you or paid
11:23
And you can just call the API and use it for your own sake
11:28
And so you're basically standing on the shoulders of giants. People have worked on some stuff and you're just using it
11:35
So APIs are amazing. And one of my favorites, and the other reason why I learned Python,
11:40
is to use Google Search Console API because Google Search Console is notoriously limited
11:46
for good reasons because they cannot store everything and provide that to everyone
11:54
But with the API, I have managed to extract more than 700,000 rows of data per day for a single website
12:03
So you are really well equipped with the API. And for example, if you want to do that report
12:12
showing the number of queries by position, it's easy to do that on a small website
12:19
You can validate subdirectory and get all the queries and all the keywords for that subdirectory
12:27
but it's a painful process. With Python, once you connect to the API,
12:33
you can extract almost everything from the API and you can do awesome visualization like this one
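As a sketch of how those large extractions work: the Search Analytics endpoint caps each response at 25,000 rows, so big pulls page through the data by bumping `startRow`. The helper below only builds the request body; the authenticated call (shown in a comment) assumes the `google-api-python-client` library and OAuth credentials, which are not shown.

```python
# Build the JSON body for one page of a searchanalytics.query request.
def gsc_request_body(start_date, end_date, start_row=0, row_limit=25000):
    """One page of a Search Analytics query; loop, adding row_limit to
    start_row, until a response comes back with no rows."""
    return {
        "startDate": start_date,
        "endDate": end_date,
        "dimensions": ["query", "page"],
        "rowLimit": row_limit,   # 25,000 is the per-request maximum
        "startRow": start_row,
    }

body = gsc_request_body("2024-01-01", "2024-01-31", start_row=25000)
print(body["startRow"])  # 25000 -> the second page of results

# With google-api-python-client and OAuth credentials (not shown):
# service.searchanalytics().query(
#     siteUrl="https://example.com/", body=body).execute()
```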
12:41
Very easy. There's a video to show you how to do that right there
12:47
Another thing I like to do is automatically archive backups in the Wayback Machine
12:55
So you go to Wayback Machine, you want to check one of your URL
13:01
It hasn't been archived in a few months. It's normal. They don't crawl everything
13:06
But you can ask, actually, Wayback Machine to come to your site
13:13
look at all your sitemaps, and archive each of your URLs weekly
13:17
So all you need to do is one line, pip install Wayback Machine
13:21
and then you do a cron job, which we'll see later, to read your sitemap every week and archive everything
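A rough sketch of that idea, assuming the Wayback Machine's public "save page now" endpoint (`https://web.archive.org/save/<url>`): parse the sitemap, then build one save URL per page. No network call is made here, and the sitemap string is a placeholder; a weekly cron job would fetch each save URL with `requests`.

```python
import xml.etree.ElementTree as ET

# A tiny placeholder sitemap; in practice you would fetch your real one.
sitemap_xml = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/blog/</loc></url>
</urlset>"""

# Extract every <loc> from the sitemap (note the sitemap XML namespace).
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
locs = [loc.text for loc in ET.fromstring(sitemap_xml).findall("sm:url/sm:loc", ns)]

# One "save page now" URL per page in the sitemap.
save_urls = [f"https://web.archive.org/save/{u}" for u in locs]
print(save_urls[0])
```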
13:30
So this is another cool stuff. Facebook. You can post to Facebook groups
13:37
This is good. You can make a list of all your favorite groups, Facebook group
13:42
and spam them as much as you want. So this is good, right? You can spam as much as you want
13:48
Frankly, it's really simple. Look how simple it is to post to a Facebook group using Python
13:56
It's only a few lines. If you don't want to spam it, you want to learn
14:00
I created a Facebook group that everyone can just use and share as much crap as they want
14:07
It's good. Just practice, please, please, please. Don't spam the web for real
14:12
Just practice on that Facebook page. I'll be happy to read your funny comments or funny stuff
14:18
and I'll be happy to see that people are learning how to use APIs
14:22
With great power comes great responsibility, right? Yeah, but mostly people will bash you
14:28
if you try to spam the web at some point. Somebody is going to cut you off
14:32
Oh, absolutely. We do have a question from Peter Majinkovic in the comments
14:37
if you want to go and get into it. So Peter asks, how would you import multiple CSV files
14:46
but transform the data, like in Power BI? Yeah, so you make a loop. I did that just this week. So what you do is you open the files. You use pandas. You go into the folder where your files are,
15:08
and you use the glob package, G-L-O-B. And with glob, you can actually read all the files
15:17
and put them into a list. And then from that list, you make a for loop
15:23
A for loop. So you make a loop, looping through each of the files
15:27
and adding them into a list of data frames with pandas. And from there, you concat.
15:34
So you do pd.concat on your list of data frames, and you're going to have one big data frame
15:42
Is it glob, G-L-O-B? Oh, G-L-O-B. Yeah. Yes
15:51
So you would do that. At some point, I can add some information
15:59
I'll put a screenshot into the slides I will share on SlideShare
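The multi-CSV answer above, as a runnable sketch (two tiny CSVs are created in a temp folder here so the example is self-contained; in practice you would point the glob pattern at your own export folder):

```python
import glob
import os
import tempfile

import pandas as pd

# Create two small CSVs so the example runs anywhere.
tmp = tempfile.mkdtemp()
pd.DataFrame({"url": ["/a"], "clicks": [10]}).to_csv(os.path.join(tmp, "jan.csv"), index=False)
pd.DataFrame({"url": ["/b"], "clicks": [20]}).to_csv(os.path.join(tmp, "feb.csv"), index=False)

# The loop from the answer: glob the files, read each into a DataFrame,
# then concatenate the list into one big frame.
frames = [pd.read_csv(f) for f in sorted(glob.glob(os.path.join(tmp, "*.csv")))]
combined = pd.concat(frames, ignore_index=True)
print(len(combined))  # 2 rows, one per file
```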
16:06
Just for you. Thank you so much, dude. So, let's move on to another thing.
16:11
A big part of our job in SEO is to protect our assets, like trying to get traffic to the content that we already have,
16:19
try to improve the ranking for this content, but also discover new opportunities in order to grow
16:28
Reddit is a fantastic tool to help you understand what people are searching for, what people are talking about
16:35
And Duarte O Carmo actually made a fantastic script showing how you can input any keyword
16:47
and see which subreddits are talking the most about it. So this is very simple
16:52
but it can really help you learn about your industry. The Reddit API is actually probably the best social media API
17:01
It gives you so much access. It lets you post, it lets you do a fantastic amount of stuff,
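A sketch of the idea behind that subreddit-discovery script: tally which subreddits mention a keyword most. `count_subreddits` is a hypothetical helper; in a real run the (subreddit, title) pairs would come from the Reddit API, for example via the `praw` library's `reddit.subreddit("all").search(keyword)`.

```python
from collections import Counter

def count_subreddits(submissions):
    """Tally subreddit names across an iterable of (subreddit, title) pairs."""
    return Counter(sub for sub, _ in submissions)

# Placeholder sample standing in for real API results.
sample = [
    ("SEO", "Python for SEO"),
    ("bigseo", "Automate your reports"),
    ("SEO", "Log file analysis"),
]
print(count_subreddits(sample).most_common(1))  # [('SEO', 2)]
```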
17:08
and stay tuned because there are some articles that are coming up on my blog at some point
17:16
about the Reddit API, going really deep into what you can do. Reason number three to love SEO and Python together
17:28
is homemade tools. So let's face it, most SEO tools are really expensive
17:33
and you don't necessarily want everything in those tools, so you can start building your own.
17:41
The first step is to learn web scraping. Unfortunately, Screaming Frog is just too good,
17:49
so there is no way you're going to be able to replace it. Yeah, you could, but it's just a
17:56
fantastic tool and they're growing faster than you're learning. So frankly, you won't
18:03
replace it. I haven't yet, and I don't plan on trying to replace it, in fact. But the reason
18:11
I think it's still good to learn web scraping is because sometimes it's easier to run the
18:17
crawling from the script that you already have in Python, and to have everything you do in
18:25
the same place. But the other reason why it's interesting is because
18:30
you learn a lot by trying to build your own crawler. You can learn how much effort
18:40
goes in, how much time it takes to make an HTTP request. You're going to start understanding HTTP headers, how they work.
18:48
I think that this is the place where I've learned the most about SEO:
18:54
understanding the technology behind a web crawler like Google. When you try to build your own crawler, you start understanding all the complexity that goes on behind a tool like Google when they try to crawl your website
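To make that concrete, here is a minimal sketch of one thing a crawler does after fetching a page: extracting links. The fetch itself would be something like `requests.get(url, headers={"User-Agent": "my-bot"})`; the HTML is inlined here so the example runs offline.

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href values from <a> tags, like a crawler's frontier step."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links.extend(value for name, value in attrs if name == "href")

# Inline stand-in for a fetched page.
html = '<a href="/blog/">Blog</a> <a href="/about/">About</a>'
parser = LinkExtractor()
parser.feed(html)
print(parser.links)  # ['/blog/', '/about/']
```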
19:09
And I really recommend that you go into there. And the other place where I recommend that you should learn is to do automated testing with Selenium
19:18
Because what Selenium does is also render the web page, because it actually opens a browser.
19:25
It navigates through a browser, and then once the page is rendered you can start testing stuff like:
19:32
are your forms working? And you can automate that every day. You could say,
19:37
look at my form and make sure that this add-to-cart button is always working, always, every
19:44
second, and alert me if it's not. So this is one very important thing that you can learn with
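A hedged sketch of that kind of automated check. Real Selenium code would use `webdriver.Chrome()`, `driver.get(url)`, and `driver.page_source`; here the rendered page is simulated with a plain HTML string so the example runs without a browser, and `button_present` is a hypothetical helper.

```python
def button_present(rendered_html, button_id="add-to-cart"):
    """Return True if the button's id appears in the rendered page.
    With Selenium, rendered_html would be driver.page_source after
    the browser has executed the page's JavaScript."""
    return f'id="{button_id}"' in rendered_html

page_ok = '<button id="add-to-cart">Buy</button>'
page_broken = "<div>Oops, checkout is down</div>"
print(button_present(page_ok))      # True  -> no alert needed
print(button_present(page_broken))  # False -> send an alert
```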
19:53
Python. You could create sitemaps. Like, as homemade tools, you could create your sitemaps without
20:00
paying for a second tool. Most big sites will have automated sitemaps, some don't,
20:11
and if you have a very big site, it's really easy to build your own sitemap with Python. So why
20:19
go with an external tool that you need to open in the browser and whatever? This is fast, this is useful.
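A minimal sketch of sitemap generation, using only the standard library (the URLs are placeholders; a real script would write the result to `sitemap.xml` and split into multiple files past the 50,000-URL limit):

```python
from xml.sax.saxutils import escape

# Placeholder URL list; in practice this would come from your database or crawl.
urls = ["https://example.com/", "https://example.com/blog/"]

# Building a sitemap is little more than templating XML around the list.
entries = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in urls)
sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{entries}\n"
    "</urlset>"
)
print(sitemap)
```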
20:25
Oh, can I ask a question about that? Is that static, or can that be done dynamically? Yes, it's
20:32
static. It could be dynamic if your website is built in Python. Like, I'm not a web developer.
20:40
I know a lot, but I'm not a web developer. If you develop your website in Python, it's easy
20:46
to make a dynamic sitemap. If you try to plug your Python script
20:54
into the database, and the database is written in Ruby, sorry, not the database,
21:00
but if your website is written in Ruby, for example, it would be better to do that in Ruby instead
21:07
So that's a point. I'm just posting that there because that's awesome
21:13
If you want to do an automated sitemap, talk to the dev please like they are better place it's a very critical thing they need to be
21:22
involved in that. If you want to build a sitemap because they don't have the resources, or you just
21:27
don't care, then do it yourself, it's fine, or use a plugin. And even
21:35
before that, if you're on WordPress, use a plugin, don't do it yourself. But if you need to, you
21:40
could. Oh, there are all kinds of reasons why sites don't have sitemaps. They'll find some
21:47
creative ways to find reasons. So before I get going, this is one of the things I want to bring
21:57
why I like Python. So I started in R at first, it's another programming language that is more
22:03
oriented towards statistics. And then I got into Python and I loved it and I stuck there. And then
22:10
I got into my new job and I found some guy that is working on Julia. My colleague, Ron
22:15
Erdus, is working with Julia and he loves it because it's really powerful. What I do
22:20
love about Python, and why I stuck there, is because there is a big community. A lot of people are doing awesome stuff on the web. In Python in SEO, I see some people
22:35
doing R stuff. Most of the people are doing Python, and almost nobody does another programming language.
22:43
JavaScript could be a good solution. There is a lot of resources for JavaScript as well
22:48
But in the end, I'm just saying Python is amazing because people, a lot of people are
22:54
building stuff and you don't have to do everything yourself. And this is one fantastic example:
23:01
Daniel Arimadia, I talked to him because he published something that I found was amazing,
23:08
and he agreed to post it on my own site, and it ended up being my favorite blog post on my own
23:14
website. So this is the power of this community, and a lot of people are doing a lot of stuff at a
23:21
very fast pace. So what he did is basically he scrapes Lighthouse and he says, if I improve one
23:30
of the features, how many more points will I get? And then you get what you should work on first.
23:39
So now you can look at any page of your website or any template of your website and you can learn
23:45
okay, you should work on Largest Contentful Paint before everything else, and your Lighthouse score
23:50
is going to go over the rest. And also, another example of the community
23:58
Yeah, sorry. Sorry, I have questions. Yes, go ahead. Going back to size speeds at large
24:06
and looking at different scripts and things like that, particularly for priority lists
24:11
how does that, from your experience, how does that differ between using Python
24:17
to be able to use the PageSpeed API and Lighthouse forecasting to say something like Screaming Frog
24:24
where you can also use the same APIs to be able to get some really
24:29
really great, I guess, analysis, like snapshots across the site? Screaming Frog is better.
24:38
So let me make it clear. Screaming Frog is always better. Not everyone wants to pay for their license
24:46
And also the thing is with Screaming Frog, you don't know how much it's going to cost in the future
24:52
And the other thing is that when you do a Python script
24:57
you could decide to... Screaming Frog is super slow as well. At some point, you get into the database
25:07
and you can get some very big pitfalls with that. But when you do it with Python
25:12
you can split your task over six months if you want. You can send that on the server
25:19
You can, like, there are lots of ways that you can. And you can also automate
25:25
you make it easier automation when you build it with your Python scripts
25:30
But frankly, I told you, like, Screaming Frog will not be replaced soon
25:37
It's just too great. And at the cost they're giving it, it's just a joke
25:41
So yes, that's your answer. Yeah, thank you very much. I'm not a sales representative of Screaming Frog, by the way.
25:52
Not sponsored. So Python is fun as well. Like with a lot of stuff, it's easy to get lost in scripts and
26:02
just waste time. It feels like a waste of time, but at least it's fun. Like this: Pierre Rouarch is
26:09
a French data scientist that is doing a lot of stuff in a large amount of sectors
26:19
and he's done some SEO stuff as well. And it's really interesting, and I translated that article for people
26:25
because I thought it was just amazing. You just take a subject and look at your BERT score
26:30
for that subject, which is a funny concept, BERT score. I don't want to get into it.
26:37
And the thing is, I spent five minutes replicating what he did
26:44
I found it amazing. I decided to write this article and I never used it again
26:49
So this is one good example of how it can be fun, but it's not always useful
26:54
So try to focus on the task that will have actual benefit over time
27:01
and try to do stuff that you will reuse. If you only do it once, it's not worth it.
27:06
Like make sure that everything you do, it needs to be replicated
27:13
So if you're bound to do that two, three, four times, you can use a Python script and never do it again
27:21
So this is my, I prefer spending a little bit more time on doing stuff
27:27
building stuff that will stay there over time. So yes, there are so many different use cases. For the previous one, sorry, I'm like,
27:39
you're showing it up on screen and I'm just kind of digesting it, but there are so
27:44
many different use cases for, you know, being able to get a BERT score for SEO, I would imagine.
27:51
This is maybe where you're looking at on-page optimization, to be able to maybe look at
27:57
an opportunity to maybe get a featured snippet? Is that sort of like a common use case for being able to answer those
28:05
questions, or just generally to try and understand the topic of your article?
28:13
By doing those things, you start to understand natural language processing a little bit better,
28:18
which is good. But, like, I have never used it again.
28:24
It's not that it's not possible to use it. It's just, how likely is it that you'll get better rankings from it? Yeah. Hamlet Batista and
28:35
Mike King have made a discussion around BERT, and around the work that Facebook
28:41
did as well, and it was really interesting. And yes, this is the part of machine learning
28:49
where we're not data scientists, we're SEOs. And once you start trying to analyze data with stuff that
28:57
uses machine learning, you are bound to make the wrong observations if you don't know what's
29:02
going on underneath. So, like, it's fun. Maybe it's not the first step. There are a lot of
29:09
steps. I showed this because it's really interesting. It's just that there are lots of steps that will be
29:16
more useful to you than this, than doing a BERT score for your site. But so I guess, if I can
29:26
kind of take what you just said and summarize it, you can tell me if I'm wrong, but
29:31
having a BERT score that you can look at, maybe across your entire site, you can get
29:38
kind of a bit of a snapshot as to how each of those pages is seen by Google
29:45
and understood, based on the sentiment analysis, based on, you know, what content
29:53
you've got on each of those pages, and whether or not they sort of match
29:59
they match what you're trying to go after and rank for. Am I correct in that?
30:09
What he did is he scraped search engine results and analyzed whether the query you're matching
30:21
matched with what he reads in the search. So if you search for something on Google
30:26
He tried to understand what was the text in the search engine, the top 10 SERP
30:34
And he analyzed what the subject was around that,
30:41
and gave your page a BERT score. Is it matching what is in Google?
30:46
So, yes, it is useful. It is just a little bit further down the track
30:55
there are usually more optimizations to do before that. Yeah, but at the same time, I know why you're
31:03
excited. It's really fun. I'm pretty sure that you're going to go and try to do it, hopefully,
31:09
and it's fun. That's what I did, and this is why I wrote about it. Yeah. So I don't want
31:18
to talk too much about machine learning, I don't think I am qualified to go too much into that
31:26
No, but it is really fascinating to think that we can leverage Python to be able to
31:32
you know, essentially assign a score from what we can understand, for what, you know
31:40
smarter people than myself, especially, have been able to figure out from the BERT algorithm.
31:48
Yeah, it is a good idea. It's just that I think it's easier to analyze entities.
31:58
Entities, you can do that with the Knowledge Graph API, which Greg Bernard has actually written about on my blog as well.
32:07
The Knowledge Graph API, he did. So this is good because it splits your article into entities
32:14
And when you look, Google understands the web in entities, and understanding your own entities lets you evaluate
32:24
if your content is actually telling what you want it to be telling
32:30
So that would be a better step. Awesome. Thank you. So Streamlit, Peter, I'm coming up on that a little bit later
32:39
but I don't know. I have never used Streamlit yet, but I saw the presentation of Charlie
32:46
It's coming up. Yes. So yes, it is good. So this is exactly going in the vein of where I'm going
32:56
Testing hypotheses. Streamlit, what it is, is a tool with which you can build web apps
33:00
and make them public for free. So this is where I'm going right now
33:06
That's what I did. I didn't do it with Streamlit. I did it with Flask and Heroku, but I built a small testing site to test SEO hypotheses
33:15
So you can do that with Python. You can build an app, and instead of waiting for your developers to do something and try something
33:22
you can actually test stuff in Google Search Console, see how it reacts
33:27
and then once you know what's going on, then you can go to your devs and say
33:32
look, it worked. We need to do it. So here are a few examples of hypotheses that I've tested
33:41
So I've made a website and I said, what happens if I time out the website
33:46
So I discovered that Google will wait for actually 30 seconds and after that, it's a server error
33:52
So I just used the live testing tool and now I can see that straight away
33:57
How long is too long for redirect? So if you redirect to a timeout, it's still 30 seconds
34:03
If you wonder how many redirects is too many, the answer is six.
34:09
At six redirects, you get a redirect error. A lot of people already know that
34:14
because I discovered a few days after doing that that John Mueller had actually given that answer
34:22
in a hangout session and they're recrawling those URLs later. But the thing is you don't need for John Mueller
34:29
you don't need to wait for John Mueller to give you all the answers. you can try stuff yourself and try to see how Google Search Console is replying you when you
34:40
do stuff. So I had a lot of fun doing this. I made a page to see how long Google would wait for
34:49
JavaScript. So I made a page that was just counting seconds. After four seconds, they didn't render
34:55
anything. I was wondering how long it would wait for the redirect. The answer is still four seconds,
35:03
because basically they're just waiting four seconds for JavaScript. There are probably other rules,
35:08
but when you make it at the most simple way, you can start understanding all these small components
35:13
and this is one of the small components. How many characters is not enough? So if you take that page
35:22
with all this content, and you just decide that you take five characters like this from the text
35:29
and you look into the test, live test URL in Google Search Console, what happened? You discover that
35:36
under 28 characters, you get a soft 404. So you need at least 28 characters on your page. Most
35:43
web pages have, but this is one rule that is probably added by Google to just identify if
35:49
your web page is actually a web page. If not, they send us a 404
35:55
So that's interesting. You can also do it yourself on your own site
36:02
So here I made a mirror of my own site. I didn't post that
36:06
I just made a mirror of my own site and I test the same idea that I just showed you there
36:12
And I just said, what if I had only one character in my article
36:16
I don't get that soft 404. So it might look redundant, but this is the kind of stuff that you can do with Python.
36:26
You can start testing a lot of ideas. I did all that in one day, by the way.
36:32
Just it took one day to do all of this. So it just gives you the scope of what you can learn
36:39
by starting to do stuff yourself instead of waiting for everything to be public. Yeah, it's good.
36:45
So how you do exactly what I just showed you, you need to build a Flask app
36:51
So Flask, Streamlit, you could use Streamlit, Peter. It's okay. I have never tried it
36:57
I thought the presentation was great. But I use Flask because I think it's really easy to use
37:05
Then what you do is you add Google Analytics to the root, or your homepage, of the Flask app.
37:10
And then you deploy it for free on Heroku; all the documentation is there.
37:17
Then you validate your app in Google Search Console, and then you can do whatever test you want
37:22
You can make it public. You can actually do a 100% mirror of your own website and just test whatever you want. So it's really interesting. If you don't know about Flask, go to this mega tutorial, which is not mine, but it is really, really complete
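For readers who want to try this, here is a minimal sketch of the kind of Flask app described (the route text is invented; in practice you would add your own Google Analytics snippet and content):

```python
# app.py -- a minimal Flask app of the kind described in the talk.
# This is only a sketch: the page text is invented for illustration.
from flask import Flask

app = Flask(__name__)

@app.route("/")
def home():
    # In a real test app you would render a template containing your
    # Google Analytics snippet; plain text is enough for a sketch.
    return "Hello from my Flask test site, comfortably over 28 characters."
```

Run it locally (e.g. `flask --app app run`), deploy it to Heroku following the linked guide, then validate the domain in Google Search Console.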
37:39
Cool. The reason number five to love Python for SEO is automation
37:46
So I told you I would not talk about automation, but it turns out I do in one slide
37:52
If you want to... Hooray! You can take a script that you've written and just automate it. Like, you could
38:02
do that on a server, but if you want to do it simply, today or tomorrow, on
38:08
your own computer, you could either use Task Scheduler on Windows or you could use cron on
38:15
Mac. So what you do, you're saying: every day, every week, every year or every second, run this Python
38:22
script. So just look at these two articles and you're going to be able to do it yourself
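As a concrete (hypothetical) example, any script like the one below can be scheduled; the crontab line in the comment assumes made-up paths:

```python
# daily_check.py -- a trivial script of the kind you would schedule.
# On Mac you would register it with cron, e.g. (paths are assumptions):
#   0 9 * * * /usr/bin/python3 /Users/you/daily_check.py
# On Windows, point Task Scheduler at the same script instead.
from datetime import datetime
from pathlib import Path

log = Path("run_log.txt")
with log.open("a") as f:
    # Append a timestamp so you can confirm the schedule actually fired.
    f.write(f"Ran at {datetime.now().isoformat()}\n")
```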
38:32
So, fantastic. So what are the next steps? Are there any other questions?
38:41
No, we're good. So, I know it's a long presentation; hopefully it's coming to an end soon. So, the next step:
38:50
You need to understand how to run a Python code. So you have two categories
38:55
You have notebooks and you have code editors. Notebooks are used to write code as a blogger
39:00
For example, you can document more. You can run line by line by
39:06
pushing the little play button, and then you run the code and you see the result
39:10
So it's really useful to learn Python. It's easier to learn Python this way
39:16
It's easier to share with people, because they don't need to install Python to run it
39:22
However, as you use Python more, you're going to move into a code editor,
39:29
because you're going to start to write code as a developer. You're going to start sending that to GitHub
39:35
So I think for the daily work, it's better to use a code editor
39:41
I don't use at all notebooks anymore, but you could. They're good
39:47
they're visually pleasing and really easy to share. Elias Dabbas, who's way above me in Python, told me to use
40:02
notebooks because it's easier, after all, to write a blog post using them. So yes, you use them as a
40:10
blogger. The next step is to install Python. This was the most painful part. It's actually
40:17
the hardest part in Python: installing it and running your first Python script
40:25
Yeah, well, the first time I tried to install it, I installed it on the internal server of
40:32
my company, because my account was on an internal server, and I couldn't use Python at all from the
40:37
command line. So yes, it's complex; there are paths involved there. But don't give up. The
40:44
first time you're going to want to give up on Python is when you try to install it. At some point you
40:49
need to take three days and just say, man, I've got to install it. Once you install it, it's not going to
40:56
take three days every time. It's not that complex; it just seems complex. You go and
41:03
you install Python with Anaconda. What it will do is install a lot of packages
41:08
that you will need in the future. It will install tools like Spyder IDE, RStudio,
41:15
and Jupyter, and stuff like this. You don't need anything of this
41:19
You just install later on VS Code. You use VS Code, but Anaconda will install
41:25
everything you need to run Python. So don't get too confused. Just do it,
41:34
and after that you don't need to go into the Anaconda Navigator
41:38
you just need to run Python like I will show you. So the next step, if you want,
41:44
and this is my favorite code editor, is VS Code. What you do is you install VS Code
41:51
and then you install the Python extension and then you need to learn how to run it
42:02
So this is really for the beginner, but it's all right. I understand the beginning is the hardest part
42:10
So try to do exactly what I showed you. I created a script called simple.py,
42:17
which runs a hello world message. It prints it to the console
42:25
The readme and the gitignore are just common GitHub files. They could be absent and it would change nothing
42:32
So just focus on the simple.py. You can run that code that you see here in two ways from VS Code
42:40
Either you use the terminal or you use the Python interpreter. If you want to use the terminal, you can open it straight from VS Code
42:49
And then you use the Python command plus the name of your script
42:53
and then it will run hello world, like we see here. So it's not that hard. It can look
43:01
complex, but it's not that hard. Once you use VS Code, it is really easier because, since
43:08
the terminal is inside VS Code, it's easier to never switch windows and always be in the same place
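The entire simple.py script from the talk amounts to a single print; assuming the filename used on the slide:

```python
# simple.py -- the whole first script: print a message to the console.
message = "Hello, World!"
print(message)
```

From the VS Code terminal: `python simple.py`.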
43:17
The other way you can run Python code is using the Python interpreter
43:22
which is basically running line by line. So you need to select the lines that you want to run
43:28
and then you press Shift-Return or Shift-Enter, and you run the code line by line
43:36
which in the beginning, if you use VS Code, you're going to use that one a lot more than the terminal
43:44
and over time you're going to move a little bit more to the terminal, but I still use both of them
43:49
Do you need to use both of them? Or, like, why do you use two?
43:56
Because once you do plots, from the terminal you need to save the plots, because they will not
44:06
show. If you want to show a graph, it will not show in the terminal. If you do this in the Python
44:12
interpreter, they will show because this is kind of your notebook on the side. So do you need to
44:20
use both? No, it's just because when you just want to run one line of your script, it's easier to use
44:27
the Python interpreter. When you want to run all the script, you use a terminal
44:34
Okay. So the next steps, once you have installed Python, you have created your first script and run it on
44:42
VS Code, I suggest you take an online course. My website is the best place to go first, go to
44:49
install Anaconda. And once you've done that, you go ahead and you take an online course. My blog is not the first place you should go, and nor is Coursera or Udemy; I think it's really hard to learn from that stuff
45:05
DataCamp is, like, so easy. Like, you have the interpreter in front of you
45:10
You have someone showing you and you can test it straight away
45:15
So you can learn really fast in a few weeks. You can learn Python and it will give you the basics
45:21
that you will need for a long time after that. You'll still need what you learn from there
45:27
So even just learning how to do maths or stuff like this
45:32
it's really interesting to go and master the basics before you get into building projects
45:40
And then you learn pandas. You just learned pandas. It is your best friend
45:46
So Julia Choban, I don't know if that's the way we spell her name,
45:50
but she's analyzed Screaming Frog data using pandas, and you can learn a lot from there
45:57
Once you've done your first course, you can start learning more about pandas,
46:01
and this is the first gate where you should go. And then you have other pandas stuff,
46:08
and you should, like, ditch Excel or even Google Sheets and just focus on Python
46:15
And if you force yourself not to use Excel or Sheets, you're going to learn Python and pandas very fast
46:23
and you're going to love it. Hamlet Batista calls it Excel on steroids
46:29
but he's wrong, because no matter how much steroids you give Excel, it will still be a slow, painful and very error-prone tool
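To make the Excel comparison concrete, here is a small sketch of the two most common Excel tasks done in pandas: a VLOOKUP-style join and a pivot-style summary. All column names and values are invented:

```python
import pandas as pd

# Toy data standing in for a crawl export and a Search Console export.
crawl = pd.DataFrame({"url": ["/a", "/b", "/c"],
                      "status": [200, 200, 404]})
gsc = pd.DataFrame({"url": ["/a", "/b"],
                    "clicks": [120, 30]})

# VLOOKUP equivalent: left-join Search Console clicks onto the crawl.
df = crawl.merge(gsc, on="url", how="left").fillna({"clicks": 0})

# Pivot-table equivalent: total clicks by status code.
summary = df.groupby("status")["clicks"].sum()
print(summary)
```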
46:37
So give up Excel, for the love of it. Is there any particular reason why it's called pandas?
46:47
I just ask because I think... That's a good question. SEOs are like: Google Panda, like the algorithm update
46:56
You know, it's... I don't know. Probably pandas came before the algorithm update
47:01
I don't know. It's because it's a lovely animal probably. I don't know at all
47:06
Well, Panda does assess the quality of your content. Maybe this is, like, maybe, the quality of the code
47:13
I don't know. I would not pronounce on that. I have no idea
47:19
I've never even asked myself. So it's good; you can find the answer,
47:26
post it on Twitter and I'm going to retweet it and like it
47:31
and even put a meme or something on it if you will
47:35
I'm all for that. So, you can go the extra mile. So, what I love the most about learning Python
47:42
is it gave me a lot more leverage to understand what web developers
47:48
were doing. Even if I already had a good idea, you start to speak the same language as them
47:54
If you want to go the extra length, you should go and start learning GitHub. Start with Python
48:00
forget about this slide for a while, come back to it in a few months, learn GitHub
48:05
commit to GitHub because you can save and you can keep a trace of what you're doing
48:10
but you can also use the same tool as your web developers are using. And once you start doing
48:16
that you can go into the code yourself later and you can start learning a little bit more
48:21
about what they do and when they do it. So yeah, this is the absolute big step
48:27
to take at some point later. And I built a guide telling you
48:31
exactly what you should do to learn it. Let's celebrate other people
48:39
Like the community is awesome. And I should start with this guy
48:44
So it's Elias Dabbas, who built advertools. It's the biggest SEO library
48:50
In fact, it's the only big SEO library. You should definitely play with it, read it, understand it
48:59
It has some stuff like this example. You can read your sitemap, but you can also use the Twitter API
49:07
You can check your robots.txt. So there are lots of things you can do
49:14
And he's really prolific, and this is the best library you could learn after Pandas
49:25
Corey brought some very good things as well: is Googlebot really Googlebot?
49:31
So you can actually take all your logs. You could basically look at a log line and make a reverse DNS lookup
49:43
from the terminal if you want, but this guy has made a Python script that does that for all the URLs you
49:49
give it. So you could look at your logs and see, for all of those lines, are they really Googlebot? So this
49:55
is fantastic work. Thanks to Greg Bernhardt as well. So, I have announced myself as a fan of
50:03
Screaming Frog. So this work is really important. How do you automate Screaming Frog with Python
50:10
So this is a must-go destination. And Streamlit. So we cannot have a presentation without
50:19
bringing in the kings: Hamlet Batista and Charly Wargnier. Yes, Charly has talked about how to build
50:28
the Streamlit app. I thought it was really interesting. I still prefer Flask, but I haven't done Streamlit
50:37
so I have no idea if Streamlit is better. So choose one, get good at one, and you're going to..
50:45
All of these tools are doing basically the same thing. So you just choose one, you get good at it,
50:51
and then you're going to see the benefits faster. I would also like to express my... sorry, I don't know why, but I was going to go on with the
51:03
Simpsons meme for that. But yeah, those two guys are really fantastic at just, you know, helping
51:10
lots of people on Twitter. Like, I haven't spoken too much to Hamlet, but Charly
51:16
has been instrumental in actually like me getting into Python for the very first time and why, you know, I'm having a lot of troubles
51:26
because as you, you know, highlighted before, just installing and, you know, starting to use like, you know, just any one
51:34
of the Python programs is a challenge I wasn't anticipating, but he's been instrumental at being able to help myself
51:45
and a lot of other SEOs out there. So shout out. Yes, it is a good shout out
51:52
And I would say the same about Hamlet Batista, who really helped me too
51:57
And most of these people have also shared... I have a blog post where they tell
52:04
what the best ways to get started were, as well. So a lot of people have contributed to this
52:12
If you go into my Python SEO blog, at some point... a lot of people have contributed, and
52:20
this is what is good about this community. Like, the SEO community is a tight community, it's a really great community, but once you get into Python SEO, then it's a smaller community where, like, everyone wants to help each other out, because we're all
52:37
fans of Python and we all love that. And yes, so if you need help, just feel free to ping me as
52:46
well, I'll be happy to help. Amazing. Good shout out to both those two
52:56
So other shout out. So yes, we cannot go around without talking about J.R. Oaks
53:04
or even Vincent Terrasi, another French guy, who has done so much amazing work
53:11
Not all of it is public, but he's made a translation machine learning model that is amazing
53:19
If you could translate all your content of your site instantly in any languages you want, you could do it
53:25
So it's good. Britney Muller has put out a series of Colab notebooks
53:31
People speaking French, go to Pierre Rouarch's blog. And Python is not everything
53:39
Like most programming, all programming languages are great. They have their upside, their downsides
53:48
I do love what Dave Sottimano and John Murch have brought in the JavaScript space
53:55
These guys are bringing stuff that actually competes pretty well with anything that Hamlet Batista is promoting
54:06
Python is great, but yes, JavaScript could very well be the language as well
54:12
To do exactly everything we just saw, we could do it in JavaScript
54:15
Or Julia, for example; it's also possible to do that there. So if you're interested in other amazing programming languages,
54:25
go to Dave Sottimano's, John Murch's, or Pierre Rouarch's blog, and you could also learn about those programming languages
54:34
So, that was a lot. Hopefully you have homework now and you're going to do it, and you're going to come up with
54:42
awesome scripts. Next, I'd love to see what you guys are coming up with in a few months. I'm pretty
54:49
sure it's going to be great. Thank you so much, that was awesome. We cue the round of
54:55
applause from all the people watching that we can't hear. That was phenomenal. Guys, of course,
55:03
like if you want to be able to reach out, I'm just going to basically echo exactly what you said
55:07
Please find him on Twitter or he's got a fantastic website with loads of fantastic blogs that he's
55:14
referenced all throughout his presentation. So yeah, thank you so much for your time, dude. That
55:20
was phenomenal. Cool. I'm happy. That was great. Hopefully I'll be there in the next
55:25
attending one of your presentations. Yeah, I'm really looking forward to when we're all going
55:33
back and, you know, being able to do this but face-to-face. We don't really have any more questions, but if you'll humor me
55:41
I have a couple of questions because I think this is kind of something
55:46
that maybe harks back to the last presentation for SEO Meetup that I ran with site migrations
55:54
But I was wondering, I'll just preface this whole thing with a little bit of a story if you can humor me
56:02
But with site migrations, one of the things that we need to look at is to be able to match up like either in the staging site to the new sites loads of strings of URLs
56:14
So a lot of the time we'll use things like, you know, fuzzy lookup to be able to go through and use a little bit of, like, NLP to be able to match that up
56:24
For me, I have a Mac laptop. And so therefore I use like Google Sheets and things like that
56:32
And a lot of that is I find that, like, some of my janky workarounds don't really work all that well
56:40
and I've sought other people's opinions on, you know, what do you use, and a lot of those people are PC users
56:49
And a bit of a shout-out to Peter Machinkovic, who has been in the comments, who's really been instrumental
56:57
to helping me see what it looks like on a PC and how you use fuzzy lookup using Excel
57:04
But we're also talking about Python right now and the limitations and, you know, things that we, you know, can break in Excel
57:17
So in your experience, is there something like that already exists, like with a Google Colab that basically does fuzzy string matching
57:27
or anything like that that you're aware of? So, for... I'm not sure the mapping from staging to dev would necessitate fuzzy matching
57:43
But I understand why you would need fuzzy matching. There is a fuzzywuzzy package in Python with which you can do fuzzy matching as well
57:53
It's a little bit... I haven't mastered it yet. Every time I try to do it,
58:00
I get into some trouble at some point. Yes. Why would you need fuzzy matching
58:10
Why wouldn't you just rewrite the URLs instead? I'm not sure why you need fuzzy matching
58:20
I understand why you would need it, but not in that case
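For readers who want to see what fuzzy matching looks like in code: fuzzywuzzy exposes similarity ratios, but the standard library's difflib gives a comparable score without installing anything. The URLs below are invented:

```python
from difflib import SequenceMatcher

def best_match(url, candidates):
    """Return (similarity, candidate) for the closest candidate URL.

    Similarity is difflib's ratio: 0 means no overlap, 1 means identical.
    """
    return max((SequenceMatcher(None, url, c).ratio(), c) for c in candidates)

# Hypothetical migration mapping: an old URL against a list of new URLs.
old = "/sales-jobs-in-melbourne-victoria"
new_urls = ["/jobs/sales/melbourne", "/jobs/marketing/melbourne", "/about-us"]
score, match = best_match(old, new_urls)
print(f"{old} -> {match} ({score:.2f})")
```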
58:26
Yeah, I think, you know, the function of what fuzzy lookup does is to identify a commonality from a URL string and match it to another, you know, to another list with the same kind of commonality
58:44
And then basically place them next to each other and just basically say, like, you know, this is as accurate as we can deduce
58:53
You know, this cuts down a lot of time. It's not a perfect solution, no doubt, but when we're dealing with, like, you know
59:01
like thousands, tens of thousands, hundreds of thousands of URL strings, you know
59:07
in two different lists or in some cases, like many different lists
59:12
I think just the ability to be able to lean on automated processes like, you know
59:20
using formulas or leveraging opportunities with Python to be able to help us do that
59:27
you know, it saves a lot of time. It saves a lot of headaches. And instead, it just now goes back to being able to go through and manually check that
59:37
but not actually have to go and manually scroll through a list and to find and match
59:42
So that's what I think is the great appeal to being able to use Fuzzy Lookup
59:49
to be able to find that. I'm not familiar with fuzzy wuzzy
59:55
Fuzzywuzzy, it's a Python package that does exactly what it's called; it's the exact replica of it, so you could do it. One thing you could also do is just split
1:00:09
with Python. You can split all of these URLs using a regex. For example, we have sales jobs in
1:00:17
Melbourne, Victoria, and I have marketing jobs in Melbourne, Victoria, and I don't have subfolders, so
1:00:23
I use regex to split that into components of the URL, and then I can get the location and the keyword
1:00:30
and group them by keyword or group them by location, even if the string is weirdly formatted
1:00:37
And I could do that in multiple languages quite easily, because I'm using regex instead of trying to do a fuzzy lookup
1:00:44
So you can split all URLs into components if you want to do that
1:00:50
You can also lemmatize words. So basically, if you have sales and sale, you could just reduce them back to S-A-L and they are all together
1:01:01
And it's kind of a fuzzy matching as well because you're just taking the common string
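A sketch of that splitting idea, assuming a URL pattern like the ones quoted in the talk (the pattern and the crude stemmer are illustrations, not a general solution):

```python
import re

# Pattern assumed from the URLs quoted above, e.g.
#   /sales-jobs-in-melbourne-victoria
pattern = re.compile(r"^/(?P<keyword>.+)-jobs-in-(?P<location>.+)$")

urls = ["/sales-jobs-in-melbourne-victoria",
        "/marketing-jobs-in-melbourne-victoria"]
rows = [pattern.match(u).groupdict() for u in urls]
print(rows)  # keyword and location components, ready to group by

# Crude stemming of the kind mentioned: strip a trailing "s" so that
# "sales" and "sale" end up in the same group.
def crude_stem(word):
    return word[:-1] if word.endswith("s") else word
```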
1:01:06
So for this kind... Like, when I did a site migration back when I was at Jobillico in Canada, the first thing I had to do was to use pandas
1:01:18
Like, that was the three months of work where I told you I made an error. I couldn't go ahead with that
1:01:24
But then I did the same work in three days because I could, with Python instead of Excel, get all the URLs from the site, match them to Google Search Console data, match them to Google Analytics data, match them to logs data
1:01:38
And then I could do all of this stuff and make the redirection pattern because I could split those URLs like you just said by saying, okay, all these locations, they actually provide zero value
1:01:50
so I could take all these locations that provide zero value and just canonicalize them back to a broader location
1:02:00
So once you get started with that... it is kind of a step to get into it
1:02:07
But yeah, about three weeks after you start learning Python,
1:02:13
you're going to reap the benefits of pandas over Excel, in only three weeks
1:02:22
So in an agency, you need to build, like I think you're in an agency
1:02:29
but for agency work, you need to build Python scripts that can be reused from client to client
1:02:36
So don't go too much into the specifics of the URLs. You can just do a subfolder and, yeah,
1:02:44
use fuzzy matching, which is good enough. Yeah, hopefully that answered your question. Yeah, it definitely...
1:02:57
Well, it gave me an insight as to, you know, how you would tackle it. And I think,
1:03:04
when we've got an opportunity to basically just dictate what, you know, the end outcome is
1:03:11
going to be, you know, that gives us like a world more flexibility. I think, you know, we've leaned
1:03:19
on trying to look for solutions like this in instances where we have, you know, we've got the
1:03:25
crawl of the existing site and maybe like a couple of other subdomains as well. And we're dealing
1:03:31
with a development agency who's basically said, you know, no, this is the list of URLs that we
1:03:37
are going to create and deal with it. And it's now up to us to bridge the gap
1:03:45
in between what we can see from our existing site and other things, to what has been created. And we need to find out fairly quickly, you know, where the discrepancies are, other than just being able to say, like, well, this list A is bigger or smaller than list B
1:04:06
I think sometimes, particularly for me, because I like to do due diligence on URLs that will or won't be transferred over to the new site
1:04:17
I want to be able to see really, really quickly what hasn't been
1:04:23
what hasn't been like, you know, captured. So I can look at the organic traffic, keywords, and backlinks
1:04:29
potentially for those URLs that won't be captured. A lot of the times, you know, I think that's kind of like another case
1:04:39
of where I find using something like, you know, fuzzy matching for incomplete or inexact matches to be a really fascinating use case for this sort
1:04:52
of thing. And maybe I might just be going down a little bit of a rabbit hole and being very excited
1:04:58
as to how Python can bridge that gap. But we can always chat about that offline as well
1:05:05
Yeah. A lot of the place where I don't actually look at the URLs, I spend most of my time on
1:05:11
Google Search Console because it gives all of that data. But the thing is, you're limited when you don't use the API
1:05:18
But when you start using the API, you start seeing some overlap of URLs that are actually
1:05:24
cannibalizing themselves. And if you're saying you have a better replacement for all those URLs and you want to match it
1:05:32
you don't necessarily need to fuzzy match with the URL. You can actually look at all the keywords that are ranked for these queries and what is the percent overlap
1:05:43
And the one that have a really big overlap, you can all redirect them or canonicalize or do whatever you want
1:05:50
Update the content if you want. So I understand your use case
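The query-overlap idea can be sketched like this; the query sets are invented stand-ins for what you would pull from the Search Console API:

```python
from itertools import combinations

# Hypothetical ranking queries per URL (in practice: Search Console API).
queries = {
    "/sale-shoes": {"buy shoes", "cheap shoes", "shoe sale", "shoes online"},
    "/shoes":      {"buy shoes", "cheap shoes", "shoes online", "best shoes"},
    "/blog/shoes": {"how shoes are made"},
}

def overlap(a, b):
    """Jaccard overlap of two query sets: 0 = disjoint, 1 = identical."""
    return len(a & b) / len(a | b)

# Flag candidate cannibals: URL pairs sharing most of their queries.
for (u1, q1), (u2, q2) in combinations(queries.items(), 2):
    score = overlap(q1, q2)
    if score >= 0.5:
        print(f"{u1} <-> {u2}: {score:.0%} query overlap")
```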
1:05:57
Another possibility would be to do entity analysis. So you analyze the entities of your
1:06:04
all your, like if, I don't know how many URLs there are
1:06:08
but if it's a thousand or 2000, it's fine. Like it's in the hundreds of thousands
1:06:14
which is why it's just like, yeah. So in the hundreds of thousands
1:06:19
you don't go about and crawl all of them. Okay. You just, you go with the URLs, you try to understand
1:06:27
So you go, that's when you go into Google search console. That's when you dive deep and you match
1:06:32
You try to overlap the percent of keywords that are shared together
1:06:38
And you're going to find out that some of those URLs don't have the clicks
1:06:43
and they are cannibalizing. And you start with the ones that get zero clicks
1:06:50
But yes, the best is you use the API, you extract everything
1:06:54
and you don't even crawl your production site by doing that. And then you look at your keywords data and URLs data
1:07:01
And from there, you're doing this fuzzy matching, rather than crawling with Screaming Frog. Screaming Frog is also a dangerous tool, because you can crawl a website very fast and you could take your website down as well
1:07:15
And you could have your devs coming with axes and bows and arrows, hating you, when you don't know what you're doing
1:07:24
So this is why learning web scraping is interesting: because you can learn what's causing those
1:07:31
problems, because Screaming Frog is maybe too powerful for some users. Yeah, especially... I
1:07:39
love that, you know, you're encouraging a lot of people to use the APIs. Again, you know,
1:07:46
we can use APIs with Python, but again, in Screaming Frog, they've got, you know, Google Analytics, Search Console, Majestic, Moz and Ahrefs, which is fantastic to be able to look at, you know,
1:08:02
at scale what, like, you know, what has happened historically on that site given, like, a different time period
1:08:10
to be able to have some kind of idea of, you know, again
1:08:14
organic traffic, keywords, backlinks, and a host of other really amazing metrics
1:08:21
But to do that, you need to crawl the website. You don't necessarily want to crawl the website
1:08:26
with millions of URLs to know that. So this is where it's kind of like
1:08:31
I would not want to impact production on that if you need to crawl a page
1:08:36
If you go ahead and crawl two million pages without telling anyone
1:08:41
or 10 million pages without telling anyone and you do that every time
1:08:45
just because you want Google Search Console data, that's not the way to go about it,
1:08:51
because you're crawling your site. If you're using the API, you get that data
1:08:55
then you get your URLs and you match them. Instead of trying to crawl
1:08:59
and do all of this all at once, you can do each of them separately
1:09:03
and you never crawl your website even once. So you never impact production,
1:09:07
and you get all this data. It's okay to crawl. It's okay to crawl
1:09:12
It's just, you don't do it like 20 times in a day
1:09:16
just because you want Google Search Console data. That's not the most efficient way to do it
1:09:23
Yeah, well, I'd be surprised if your machine was able to handle a site that was like 1 million URLs
1:09:29
plus 20 times a day. One, I think your machine would just kind of sound
1:09:34
like a jet engine. And two, I think, yeah. I think you would definitely come into a couple of issues
1:09:41
with some devs. Yeah, and also once you try, you load all your JavaScript
1:09:46
usually you've crawled through two millions of URLs and then you try to save it
1:09:50
and it takes two days just to save your file. So I love Screaming Frog
1:09:56
but yeah, in fact, crawling millions of URLs, I haven't done that
1:10:01
in quite a bit of time. So you don't need to crawl
1:10:05
an entire site ever or almost never. You can do it. It's all right
1:10:10
but most of the time you don't need to crawl an entire site. You need to do it at least once
1:10:15
to have all the scopes of URLs. If you don't have specific rules
1:10:21
sometimes the devs can give you the rules and you're going to have all the URLs
1:10:25
without even crawling. But yes, you need to crawl at some point
1:10:29
but you don't ever need to crawl the entire site. Yeah, awesome
1:10:35
I have one last question and I think we'll wrap it up there
1:10:39
What has been the coolest time-saving solution that you've used with Python to help you in your everyday tasks as a senior SEO specialist at Seek
1:10:56
I want to tell it. I don't want to tell it. So big time-saving is important
1:11:03
Tell us. And it's not even my script. It's my colleague's script that was looking at some metrics that we need to keep track of every day. And whenever they make a big change, those metrics change
1:11:19
And so basically, for those metrics, you make your report, you analyze, you make counts, you make...
1:11:27
Like, so there are like 50 formulas going around to be able to make the report, and he built it in
1:11:35
Julia, in fact, and I translated it into Python. But basically, the report takes you one second
1:11:42
every month, and that report is being made. So it goes to the page that you need, it looks at the data you want, it extracts it, computes the data, and it writes the report for you, adding all that data to the report. And then you take one second, and you copy
1:11:58
and paste it, and you send it to your colleague. So: making reports. But I don't want to
1:12:05
share it, because I haven't cleared that up with anyone, so I don't want to tell. So use your
1:12:11
imagination. But yeah, building report, this is good because with Python, you can do all
1:12:17
of these things. So if, for example, you're looking at sales, I know
1:12:21
that one of you guys is working for e-commerce sites for Best Buy, I think
1:12:27
or not Best Buy, but JB Hi-Fi, I think. So if you're
1:12:33
looking at sales or the price of product, for example, you're trying to
1:12:38
look at no, uptime. Yeah, better, even better. If you're trying to look at
1:12:46
make a report for the out-of-stock products and you want to absolutely take a look
1:12:52
at your top 10,000 products, you want to make sure that they're never out,
1:12:57
then you can crawl those pages every day or every week or every month
1:13:03
and go extract the out-of-stock tag. And then, once you get it, if it's out of stock, you say zero.
1:13:11
If it's in stock, you say one, and then you compute all those metrics.
1:13:15
And at the end, you're saying, okay, over the months, out of those 10,000 products, we had, plus or minus, like 12 points,
1:13:26
for example, saying 12 products were out of stock for one day
1:13:30
So you compute your metric, and at the end, you keep all that in a database and you make your graph of the number of out-of-stock products
1:13:41
per month. And every time you take that out-of-stock product per month and you send a report that is
1:13:49
automated; it's written: this month, we had 25,000 out-of-stock products, which is 12% more
1:13:57
than last month, which is 10% more than expected compared to last year
1:14:02
for the same period. And you send it automatically using the Slack API to one of your colleagues
1:14:09
that needs to know that, your boss or whatever. And that way you save yourself from giving that answer to anyone
1:14:18
You never have to do it yourself ever again because it does that automatically
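The report-writing step described here boils down to computing the change and formatting a sentence; the counts below are invented:

```python
# Sketch of the automated monthly out-of-stock report; counts are invented.
this_month, last_month = 25_000, 22_321
change = (this_month - last_month) / last_month

report = (f"This month we had {this_month:,} out-of-stock products, "
          f"{change:+.0%} versus last month.")
print(report)
# The talk then pushes a string like this to colleagues via the Slack API.
```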
1:14:23
So that would be a similar thing to what I was telling, but with the JB Hi-Fi products
1:14:33
Which, as we can see in the comments, Peter Machinkovic, my good friend, is the JB Hi-Fi guy
1:14:40
And good on you. I tried to do a case study on JB Hi-Fi and I could not
1:14:49
find any problem with your out-of-stock product pages. So good on you. I tried to make
1:14:59
a case study on that, but good work from Peter on JB Hi-Fi. It's inspiring
1:15:07
Yeah, he's absolutely amazing. Thank you so much again for your time
1:15:15
Guys, round of applause. Thank you so much. This is utterly incredible
1:15:21
And, again, if you want to be able to reach out to Jean-Christophe
1:15:25
please – sorry, just saw the little comment from Peter. If you want to be able to reach out to him, find him on Twitter
1:15:34
at ChouinardJC or, of course, at jcchouinard.com. Thank you so much for your time, everyone
1:15:44
Have a wonderful evening, and I will see you in the new year