
FishyFish — Fish Classification Web Development

Ryan Lee

This project is part of [Silicon Valley Summer22 Bootcamp] hosted by Team-I

Github: https://github.com/2022SVBootcamp-Team-I/FishyFish

main page of fishyfish

I recently read about the “Tragedy of the Commons” which talks about how shared resources are easily exploited.


I won’t go over every aspect of the topic; I just wanted to roughly share why our team decided to create a fish-classification web project.

The main purpose of the web project was to classify fish and provide the regulations that apply to them, such as size limits and open seasons. Later on, we also decided to provide information about the toxicity of each fish.

In order to create the web project, we chose the following “main” stacks:

Database: PostgreSQL

  • PostgreSQL is a powerful SQL database that pairs well with Django

Backend: Django/Django Rest Framework

  • Django is a Python-based web framework that is fast and comparatively easy to build with, compared to other frameworks such as Spring Boot
  • Django handles multi-page applications better than Flask (not that we were committed to a multi-page application, but we gave it a try)

Frontend: React.js (on TypeScript)

  • React is a nice frontend library for building user interfaces
  • It supports Atomic Design (reusing components)
  • Using it with TypeScript helps us eliminate possible errors (considering that JavaScript is a language created in 10 days…)

AI: YOLOv5 on Colab

  • YOLOv5 is a machine-learning tool built on PyTorch
  • A nice tool for building classification models in a short period of time
  • Colab lets us train on a free GPU

DevOps: Docker, AWS EC2 Instances

  • Docker is used to containerize our application
  • It helps us manage the versions of our components
  • With Docker, local installations and virtual environments are unnecessary
  • The application was deployed on AWS EC2 instances

A detailed architecture is provided in the picture:


The database stores all the data our application needs.

We configured PostgreSQL in Django’s settings and used pgAdmin during the development process.

fishyfish ERD
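As a sketch, the relevant part of a Django settings.py wired up for PostgreSQL might look like this (the database name and credentials below are placeholders, not the project’s real values):

```python
# Excerpt of a Django settings.py configured for PostgreSQL.
# Names and credentials are placeholders read from the environment.
import os

DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": os.environ.get("POSTGRES_DB", "fishyfish"),
        "USER": os.environ.get("POSTGRES_USER", "postgres"),
        "PASSWORD": os.environ.get("POSTGRES_PASSWORD", ""),
        "HOST": os.environ.get("POSTGRES_HOST", "localhost"),
        "PORT": os.environ.get("POSTGRES_PORT", "5432"),
    }
}
```

Reading the values from environment variables keeps secrets out of the repository and plays nicely with Docker later on.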


The main purpose of the backend is to handle incoming requests, manage data in the database, and return responses.

Using Django and Django Rest Framework, a CRUD REST API was built. The API handles the following areas:

  • signup, signin, signout
  • image upload
  • image view
  • image delete
Swagger API Doc

Login was implemented with signed JWTs (JSON Web Tokens). We stored the token in a cookie when someone logged in and deleted the cookie on logout.

Check with Postman:

token created during login
token stored in cookies
token deleted from cookies after logout
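For illustration, here is a minimal sketch of how an HS256-signed JWT is built and verified using only the standard library (the real project used a library for this; the secret and claims below are made up):

```python
# Minimal HS256 JWT sketch: header.payload.signature, all base64url-encoded.
# Not production code; the secret and payload are illustrative.
import base64
import hashlib
import hmac
import json


def b64url(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()


def make_jwt(payload: dict, secret: str) -> str:
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = b64url(json.dumps(payload).encode())
    signing_input = f"{header}.{body}".encode()
    sig = b64url(hmac.new(secret.encode(), signing_input, hashlib.sha256).digest())
    return f"{header}.{body}.{sig}"


def verify_jwt(token: str, secret: str) -> bool:
    header, body, sig = token.split(".")
    expected = hmac.new(secret.encode(), f"{header}.{body}".encode(),
                        hashlib.sha256).digest()
    return hmac.compare_digest(b64url(expected), sig)


token = make_jwt({"user_id": 42}, "server-secret")
# The server would then attach the token to the response, e.g. in Django:
# response.set_cookie("jwt", token, httponly=True)  # and delete_cookie on logout
```

Note that a JWT is signed rather than encrypted: anyone can decode the payload, but only the server holding the secret can produce a valid signature.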


The main purpose of the frontend is to take care of graphical user interfaces (GUIs).

Using React, we built pages out of reusable components (Atomic Design), which saved time during the development process.

Because our group loves retro aesthetics and wanted to make the site fun, we adopted Nintendo and Game Boy designs for our pages.

Nintendo and Gameboy Style

Fishing is not such a big thing here in America, but in places like Korea (where most of our group is from) it is seen as a hobby for older folks, so the team wanted to make it attractive to teens, hoping it would feel a bit like Pokémon Go.

Hopefully it kinda feels like that…


The main purpose of our AI is to classify fish using a trained machine-learning model.

Using tools provided by PyTorch and YOLOv5, we were able to train our model on hand-labeled image data. (I never want to label images again…)
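For context, YOLO-format label files contain one line per object: a class id followed by a normalized bounding box. A small validator sketch for such hand-labeled data (the class list here is a hypothetical subset based on our halibut/rockfish examples, not our full label set):

```python
# One line of a YOLO label file looks like:
#   <class_id> <x_center> <y_center> <width> <height>
# with all box coordinates normalized to [0, 1].
CLASSES = ["halibut", "rockfish"]  # hypothetical subset of our labels


def parse_label_line(line: str):
    """Parse and validate a single YOLO-format label line."""
    cls, x, y, w, h = line.split()
    cls_id = int(cls)
    box = tuple(float(v) for v in (x, y, w, h))
    if not 0 <= cls_id < len(CLASSES):
        raise ValueError(f"unknown class id {cls_id}")
    if not all(0.0 <= v <= 1.0 for v in box):
        raise ValueError(f"coordinates must be normalized to [0, 1]: {box}")
    return CLASSES[cls_id], box
```

Running a check like this over every label file before training catches typos early, which matters a lot when the labels were drawn by hand.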

The following are our results:

result example halibut and rockfish
training result data

More information regarding the training result can be found here:


The purpose of this portion of the project is to containerize and deploy our application.

We used Docker to containerize our application before deploying it on EC2.

You might ask, what is Docker and why do we need it?

From official docker.com:

Docker streamlines the development lifecycle by allowing developers to work in standardized environments using local containers which provide your applications and services. Containers are great for continuous integration and continuous delivery (CI/CD) workflows

In simple terms, Docker solves problems such as “Hey, the version you are using isn’t working on my laptop!” Docker creates its own environment for each “image” to operate in. More information regarding Docker can be found here:
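As a sketch, a Dockerfile for the Django side of the app might look like this (the Python version, module path, and port are assumptions for illustration, not the project’s actual values):

```dockerfile
# Hypothetical Dockerfile for the Django backend.
FROM python:3.9-slim
WORKDIR /app

# Install dependencies first so this layer is cached between builds.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

# Gunicorn serves the WSGI app; "config.wsgi" is a placeholder module path.
CMD ["gunicorn", "config.wsgi:application", "--bind", "0.0.0.0:8000", "--workers", "4"]
```

The same image then runs identically on a laptop and on an EC2 instance, which is exactly the “works on my machine” problem Docker removes.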


Other frameworks and tools were also used in the project.


Nginx

  • We used nginx when deploying the React app
  • Nginx works as a reverse proxy
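A minimal sketch of that reverse-proxy idea (the paths, ports, and upstream name are assumptions, not our exact config):

```nginx
# Hypothetical nginx config: serve the built React app, proxy API calls.
server {
    listen 80;

    location / {
        root /usr/share/nginx/html;   # static files from the React build
        try_files $uri /index.html;   # client-side routing fallback
    }

    location /api/ {
        proxy_pass http://backend:8000;   # Django/Gunicorn container
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```

With this setup the browser only ever talks to nginx, and nginx decides whether a request is a static page or an API call for Django.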


Gunicorn

  • Gunicorn is a WSGI server that spawns and manages multiple worker processes so the application can serve many users at once

S3 Bucket

  • We store our images here and save the returned URL of each image in our database


Prometheus & Grafana

  • Prometheus is used to scrape and monitor metrics from our application
  • We monitor requests, memory, cache, etc.
  • Grafana turns those metrics into a visual dashboard
a part of our grafana dashboard
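For illustration, a Prometheus scrape job for the backend might look like this (the job name, target address, and the assumption that the app exposes a /metrics endpoint are all placeholders):

```yaml
# Hypothetical prometheus.yml fragment for scraping the Django backend.
scrape_configs:
  - job_name: "fishyfish-backend"
    scrape_interval: 15s
    static_configs:
      - targets: ["backend:8000"]   # assumes a /metrics exporter on the app
```

Grafana then points at Prometheus as a data source and renders dashboards from whatever those scrapes collect.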

Celery & RabbitMQ

  • Instead of creating multiple Django servers or a separate Flask app server for the AI, Celery was used to handle AI requests. RabbitMQ takes care of the queue of tasks sent from Django to Celery.
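Conceptually, Celery plus RabbitMQ is a producer/consumer queue: Django enqueues an inference job and a worker picks it up asynchronously. A standard-library sketch of that pattern (the real project used Celery tasks; `classify()` here is a stand-in for the actual YOLOv5 inference call):

```python
# Producer/consumer sketch of the Django -> RabbitMQ -> Celery flow,
# using only the stdlib. Names and paths are illustrative.
import queue
import threading

task_queue = queue.Queue()   # stands in for RabbitMQ
results = {}                 # stands in for the Celery result backend


def classify(image_path: str) -> str:
    """Stand-in for running the trained model on an uploaded image."""
    return f"prediction for {image_path}"


def worker():
    """Stand-in for a Celery worker process consuming tasks."""
    while True:
        job_id, image_path = task_queue.get()
        results[job_id] = classify(image_path)
        task_queue.task_done()


threading.Thread(target=worker, daemon=True).start()

task_queue.put((1, "uploads/fish.jpg"))   # Django "sends" a task
task_queue.join()                          # the worker processed it off-thread
```

The payoff is that the web process returns immediately instead of blocking on model inference, and heavy AI work can scale by adding workers.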


GitHub

  • Well… We stored our source code here…

Demo Videos:


And last but not least…

Our Team:

Github Profiles:

Ryan Lee: https://github.com/printSANO

Yeonjin Kim: https://github.com/homebdy

Heeyeon Son: https://github.com/fluorine1805

Joonhyeon Yong: https://github.com/pione3r

Jaebin Yu: https://github.com/Yujaebin