What Is The Idea Behind Bronitoring?

Jan 10, 2024

Why not have a tool that can scan an entire website, identify broken links, gather page structure, analyze SEO tags, and evaluate page markup on-the-fly, providing a detailed report whenever an issue arises? Let’s take a closer look!

For individuals who own websites, operate web agencies, or work as solo freelancers building websites and apps, Bronitoring offers a convenient solution to thoroughly inspect every aspect of their sites before delivering them to clients or using them personally.

History

Having been a software developer for more than 15 years, I am always looking for new ideas for products. Sometimes the motivation to build side projects was to practice programming skills; other times it was straightforward: I wanted to create a product that I could use daily and that would also make other people's lives easier.

Everything I built before was for my personal needs, such as a price tracker service, habit tracker, marketplace, and freelance platform. Most of them I developed to practice and learn something new, like Next.js, Web3 stack, and Go.

Two Years Ago

For many years I have been thinking about building my own website monitoring service, one that could cut the amount of money I pay for other services, such as UptimeRobot. Why not just use them, then? I did, and I still do, but they miss one important thing for me: I want to cover more than just uptime status.

So I built my first tool, written in Go and Ruby with a plugin system, and it covered everything I needed: DNS checks, uptime status, and the presence or absence of a specific word on a given URL. Nothing interesting there, just simple crawlers and analyzers.
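
Just to make the idea concrete, here is a minimal sketch of that kind of check in TypeScript (the original tool was written in Go and Ruby; the URL and keyword below are hypothetical):

```typescript
// Minimal uptime + keyword check: one HTTP request, no browser.
// Illustrative sketch only; the original tool was written in Go and Ruby.
async function checkPage(url: string, keyword: string): Promise<void> {
  const started = Date.now();
  const response = await fetch(url);
  const elapsedMs = Date.now() - started;
  const body = await response.text();

  console.log(`${url} -> HTTP ${response.status} in ${elapsedMs} ms`);
  console.log(`keyword "${keyword}" present: ${body.includes(keyword)}`);
}

// Hypothetical example values
checkPage("https://example.com", "Welcome").catch(console.error);
```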

How It Worked Before

I wasn't good at frontend and UI back in the day, so the tool ran on my VPS and sent reports to Telegram when something went wrong. I was never a huge fan of frontend development because I'm more into backend work, and every time I needed to build a UI, I reached for Twitter Bootstrap or something similar.

If you are a backend developer, you know exactly what I mean.

To summarize, the first version of Bronitoring was super simple and could only be used on my own VPS. I haven't open-sourced it; it still lives in a private GitHub repo.

These Days

A month ago, when I started thinking about redesigning this blog, I found that a few pages were reported as not indexed in Google Search Console because of the <meta name="robots"> tag.

When I create a new page, the first thing I do is block indexation by search robots and crawlers to focus on content, and only after that do I work with SEO tags and publish the page. Unfortunately, I forgot to remove this tag in my SEO plugin.
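
This is exactly the kind of mistake an automated check can catch. As a rough illustration (not Bronitoring's actual code), a naive regex-based check for a lingering noindex tag could look like this:

```typescript
// Naive check for a lingering <meta name="robots" content="noindex"> tag.
// Illustrative only; a regex is not a full HTML parser.
async function checkRobotsMeta(url: string): Promise<void> {
  const html = await (await fetch(url)).text();
  const tag = html.match(/<meta[^>]+name=["']robots["'][^>]*>/i)?.[0];

  if (tag && /noindex/i.test(tag)) {
    console.warn(`WARNING: ${url} is still blocked from indexing: ${tag}`);
  } else {
    console.log(`${url} is not blocked by a robots meta tag.`);
  }
}

checkRobotsMeta("https://example.com/new-page").catch(console.error); // hypothetical URL
```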

Once I fixed the issue, I still had to wait a few days or even weeks for the page to become available on Google again.

How can we avoid such situations and catch issues before they cause harm? Add our pages to a website monitoring service that analyzes them 24/7/365 and sends us reports.

There are many alternatives, right?

We have many tools that offer such functionality. But how about having not only HTML tag analysis but even more: checking for the presence of a Google verification code, validating Open Graph and Twitter Card markup, and covering additional tags for LinkedIn and Pinterest?
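
As a rough sketch of what such checks could look like (illustrative only; the tag names are standard, but the URL is hypothetical, and a real implementation would use a proper HTML parser rather than regexes):

```typescript
// Naive presence checks for a few social/SEO tags.
// Illustrative only; a real implementation should use an HTML parser.
async function checkSocialTags(url: string): Promise<void> {
  const html = await (await fetch(url)).text();

  const checks: Record<string, RegExp> = {
    "google-site-verification": /<meta[^>]+name=["']google-site-verification["']/i,
    "og:title": /<meta[^>]+property=["']og:title["']/i,
    "og:image": /<meta[^>]+property=["']og:image["']/i,
    "twitter:card": /<meta[^>]+name=["']twitter:card["']/i,
  };

  for (const [name, pattern] of Object.entries(checks)) {
    console.log(`${name}: ${pattern.test(html) ? "present" : "MISSING"}`);
  }
}

checkSocialTags("https://example.com").catch(console.error); // hypothetical URL
```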

Of course, we could try to cover all of these rules with UptimeRobot or other tools and check them in one place. We could even go a step further and use multiple services, each covering a specific scenario.

How about testing our DNS records to make sure email for our domain keeps working? How about analyzing robots.txt and sitemaps? And what if an SEO plugin update leaves us with broken tags?
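
For the email part, such a check could be as simple as verifying that the domain still has MX and SPF records. A minimal sketch using Node's built-in resolver (the domain is hypothetical):

```typescript
// Verify that a domain still has MX and SPF records so email keeps working.
// Illustrative only; uses Node's built-in DNS resolver.
import { resolveMx, resolveTxt } from "node:dns/promises";

async function checkMailDns(domain: string): Promise<void> {
  try {
    const mx = await resolveMx(domain);
    console.log(`MX records found: ${mx.length}`);
  } catch {
    console.warn(`MISSING MX records for ${domain}`);
  }

  try {
    const txt = (await resolveTxt(domain)).map((chunks) => chunks.join(""));
    const spf = txt.find((record) => record.startsWith("v=spf1"));
    console.log(spf ? `SPF record: ${spf}` : `MISSING SPF record for ${domain}`);
  } catch {
    console.warn(`MISSING TXT records for ${domain}`);
  }
}

checkMailDns("example.com").catch(console.error); // hypothetical domain
```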

I came back to the idea of having a service that covers everything I need in a single place. Why? Because I know what I want, and I understand what it takes to have great websites.

Tech Stack

Since 2022, I've been a big fan of Next.js, so I decided to keep using this amazing framework and rewrite the platform to learn TypeScript and other tools, like Jest for testing. I built the first version in one week, and it worked great.

Then I started thinking about how to make it better and useful for other people. I didn't want to waste time building crawlers on Headless Chrome; the initial idea was to analyze a given URL on-the-fly and immediately report any issues found on it.

What Does Bronitoring Support Now?

At the time of writing, it works well for simple analysis.

It can identify missing or broken Open Graph and Twitter tags, social media images, and invalid usage of icons and favicons, and it validates the length of the title and description, as well as other tags. You just enter your page address and receive a report in seconds.
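
As an illustration of the length checks, here is a minimal sketch; the limits below are common rules of thumb, not necessarily the exact thresholds Bronitoring applies:

```typescript
// Title and meta description length check. The limits are common rules of thumb,
// not necessarily the exact thresholds Bronitoring applies.
async function checkTitleAndDescription(url: string): Promise<void> {
  const html = await (await fetch(url)).text();

  const title = html.match(/<title[^>]*>([^<]*)<\/title>/i)?.[1]?.trim() ?? "";
  const description =
    html.match(/<meta[^>]+name=["']description["'][^>]+content=["']([^"']*)["']/i)?.[1] ?? "";

  if (title.length === 0 || title.length > 60) {
    console.warn(`Title length ${title.length} is outside the recommended 1-60 characters.`);
  }
  if (description.length < 50 || description.length > 160) {
    console.warn(`Description length ${description.length} is outside the recommended 50-160 characters.`);
  }
}

checkTitleAndDescription("https://example.com").catch(console.error); // hypothetical URL
```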

But that's not enough to cover the main scenario: I want to know that everything is fine with my entire website, not only with a single page. And these things must be monitored continuously, from around the globe.

Is It Available Already?

Yes! You can check out your website or a specific page by using this platform: https://bronitoring.alexweb3.com/. Currently, it’s hosted on a subdomain until I implement all the features I want to have. After this step, I will release the full-featured version on its own domain.

Roadmap

To achieve this, I need to develop a set of crawlers and parsers that simulate real user behavior on a page to get past Cloudflare and other firewalls.

On-the-fly crawlers alone are not enough to measure page loading speed or validate redirects; I need a smarter approach. I will use Ruby with Capybara for these crawlers. I also need multiple geographically distributed servers to track performance from different locations.

I don't know; maybe one day I will decide to capture screenshots of how my pages look on mobile or tablet devices. In that case, a headless browser will help a lot. But for now, I need to focus on the monitoring side: I want to know when my website is not available and get a fast report in my email or a Telegram chat.

Here is my short roadmap:

  • Implement crawlers that simulate real browsers, not only HTTP requests.
  • Collect the website structure and display it in the UI.
  • Check for broken or dead links on each page.
  • Monitor entire websites continuously, as well as specific pages.
  • Check for the presence or absence of specific words on any page.
  • Add a registration process.

Let’s see how it goes.

