Baboon API & Baboon API Visualizer

What's that?

🦧 Baboon API is a public API similar to the traditional "Dog and Cat APIs".

This tool allows developers to fetch photographs of baboons via HTTP requests and display them directly in the UI they're building.

This way, they can work on how images are positioned amongst the other graphical elements.
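To give an idea of the workflow, here is a minimal sketch of how a client might consume the API. The endpoint path and the response shape (`{ url }`) are assumptions for illustration, not the documented contract; the fetch implementation is injectable so the function can be exercised without a network.

```typescript
// Assumed response shape -- the real API may differ.
interface BaboonResponse {
  url: string;
}

// Fetch one baboon image URL from a (hypothetical) endpoint.
// `fetchImpl` defaults to the global fetch but can be replaced in tests.
export async function fetchBaboonUrl(
  base: string,
  fetchImpl: typeof fetch = fetch,
): Promise<string> {
  const res = await fetchImpl(`${base}/baboon`);
  if (!res.ok) throw new Error(`Baboon API responded with ${res.status}`);
  const data = (await res.json()) as BaboonResponse;
  return data.url;
}
```

The returned URL can then be dropped straight into an `<img>` tag while laying out the rest of the interface.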



Particularities

🤔 An API is already a technical concept, so this page may not have made much sense so far. How can anyone see baboon images without Googling them? Only one solution: a visualization website!

🧱 Following the creation of the Baboon API, I used the Vue framework and the TypeScript language to create a small site showcasing the capabilities of the Baboon API. It allowed me to get back into Vue and experiment with an original design.

📡 An API is a server-side program. The one I created scans a folder full of photos and returns one or more signed URLs. However, such a project would have been of rather limited interest if the directory contained only 5 to 10 images... In this type of application, several hundred are usually available. Since I wasn't going to spend days scouring the internet to save baboon photos, I decided to step into the world of web scraping!
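The server-side idea above can be sketched in a few lines: list the image folder, pick a random file, and attach an expiry and an HMAC signature to its path. This is an illustrative reconstruction, not the actual implementation; the secret, query-parameter names, and file-extension filter are all assumptions.

```typescript
import { createHmac } from "node:crypto";
import { readdirSync } from "node:fs";

// Pick a random image file from the stock directory.
export function randomImage(dir: string): string {
  const files = readdirSync(dir).filter((f) => /\.(jpe?g|png|webp)$/i.test(f));
  return files[Math.floor(Math.random() * files.length)];
}

// Build a time-limited signed URL: the signature covers the path and the
// expiry timestamp, so the server can verify it later without storing state.
export function signUrl(path: string, secret: string, ttlSeconds = 300): string {
  const expires = Math.floor(Date.now() / 1000) + ttlSeconds;
  const sig = createHmac("sha256", secret)
    .update(`${path}:${expires}`)
    .digest("hex");
  return `${path}?expires=${expires}&sig=${sig}`;
}
```

In the real project this logic sits behind an ElysiaJS route running on Bun, but the signing idea is runtime-agnostic.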

🤖 With the Puppeteer library, this task can be automated in JavaScript. After spending a little time setting it up, I was able to launch a bot and watch it do all the dirty work in a few seconds!

Challenges

🎯 In my eyes, this project couldn't really have been considered "achieved" without its web (visualization site) and scraping (local tool to build the image stock) counterparts. So I had to be persistent and push my way through several different disciplines, some of which, like web scraping, were entirely new to me.

🧪 Beyond that, Bun and ElysiaJS are very young tools with young communities. Few resources are available, since few people have tried them yet, which can make implementing seemingly simple features quite difficult.

Motivations

💡 As a developer, I occasionally use generic APIs. Often the same ones. And when they're animal-themed, it's always dogs or cats...

🚀 On the other hand, I was taking online courses on the Bun runtime and the ElysiaJS framework. It then occurred to me that I could use my new skills to create a public API just like the ones I used, while offering a new animal.

Next steps

🔍 Although building a web scraper was very instructive, I don't think I have the desire or need (for now) to repeat this kind of exercise. If that ever changes, I would probably switch to Python: scraping is more common (and therefore easier) in that language than in JavaScript. One could also imagine a role for AI in duplicate recognition and quality filtering, which I did manually this time.

📈 Currently, the API visualizer is equipped with a visit tracker using Google Analytics. If one day I notice that my tool has to handle heavy request loads, I will upgrade the application by adding Docker and Kubernetes to the project, in order to ensure better load distribution.