seanwilson 6 years ago

Hi! Not my post but I'm the developer behind the guide and the Chrome extension that goes with it:

https://www.checkbot.io/

The Chrome extension is a local web crawler that crawls your website to automatically check that your pages follow the 50+ SEO, speed and security rules from the guide. The Checkbot guide and extension recently went into public beta, so please let me know what you think. Thanks!

  • andrethegiant 6 years ago

    I used Checkbot for the first time last week, and it gave me useful suggestions to follow. So thanks for that!

    One suggestion I have is to be able to set cookies (much like you can set the user agent). The majority of my project is behind user auth (not HTTP auth) so Checkbot can't access those pages without hitting a 401.

    Also, I'm curious as to why you decided to make a browser extension rather than a standalone web service. Any particular reason why?

    • seanwilson 6 years ago

      > One suggestion I have is to be able to set cookies (much like you can set the user agent).

      Thanks for taking a look and for the suggestion! You mean you'd be happy with an interface where you paste in the cookie value you wanted to use? Would the cookie ever have to be updated to keep working?

I'll need to think about the best way to implement this. For example, I could allow setting custom headers to support this and other customisations, or perhaps let you log in via an iframe first to capture the cookie data.
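To sketch what the "paste in cookie values" idea could look like under the hood, here's a minimal example of a crawler attaching user-supplied cookie key/value pairs to each request as a single Cookie header. This is purely illustrative and not Checkbot's actual implementation; the function names and the example cookie are hypothetical.

```python
from urllib.request import Request, urlopen

def cookie_header(cookies):
    """Join user-supplied key/value pairs into one Cookie header value."""
    return "; ".join(f"{key}={value}" for key, value in cookies.items())

def fetch_with_cookies(url, cookies):
    """Fetch a URL while sending the given cookies, so pages behind
    cookie-based auth respond instead of returning a 401."""
    request = Request(url, headers={"Cookie": cookie_header(cookies)})
    with urlopen(request) as response:
        return response.status, response.read()

# Hypothetical usage: crawl a page behind cookie-based login.
# status, body = fetch_with_cookies("https://example.com/account",
#                                   {"sessionid": "abc123"})
```

A more general "custom headers" setting would subsume this, since a cookie is just one header among others (e.g. Authorization) a crawl might need.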

      > Also, I'm curious as to why you decided to make a browser extension rather than a standalone web service. Any particular reason why?

A similar hosted web service would either be expensive or come with limitations on how many sites you can crawl and how often. For example, you might be limited to crawling a single production website once a week.

With the extension approach, you can crawl unlimited websites, recrawl as often and as soon as you want, and easily crawl localhost and private sites. You don't need to wait a week to discover an issue has hit production, then wait again to find out your attempted fixes didn't work. This way you can test for issues at all stages of development (localhost/development, staging and production) and get immediate feedback on whether your fixes worked. That's the workflow that I want to support.

      • andrethegiant 6 years ago

        Yeah, I'm looking to add multiple cookie key/value pairs. They don't have to change or update once they're set.

        > That's the workflow that I want to support.

        Gotcha, makes sense!

  • deadcoder0904 6 years ago

I'm not the developer, but it is my post, LOL.

Jokes aside, this will be really helpful for me when I'm building my Gatsby-powered blog. Thanks for that. Also, what's the difference between Checkbot and Lighthouse?

    • seanwilson 6 years ago

So Checkbot gives you a simple way to check thousands of pages for issues in a couple of clicks (e.g. broken links, HTML/JS/CSS validation errors, mixed content issues) as well as identify problems you'd only find by testing groups of pages (e.g. duplicate titles/descriptions/pages/resources). If you think you've fixed a problem, you can quickly recrawl a page or the whole site to confirm your fix worked. That's the workflow that Checkbot is geared to support.

      Thanks for giving Checkbot a try! Let me know how you get on with your blog as I'm keen to get suggestions on what can be improved.

finfun234 6 years ago

good work! this is excellent.