
Web-based platform (Software as a Service) for energy metrics across all EU countries (e.g. production, consumption). This project was part of the SaaS course at NTUA (8th semester).


NTUA ECE SAAS 2022 PROJECT | TEAM (20) - EnergyLive

Description + Technologies

The goal of this project is to create a web application delivered as SaaS.
End users can subscribe to our monthly service and we provide them with detailed graphs and statistics of the energy data that all EU countries produce each day.
We provide authentication with Google credentials as well as with our own authentication service.
Tools used:
JavaScript, Node.js, Python, Webpack, PostgreSQL, Kafka, Docker

Platform link:
🔗 https://www.energy-live.com

YouTube link:
🔗 EnergyLive | presentation

👩‍💻 Contributors (Alphabetically)

Name Diagrams / Architecture Database API Frontend Parser / Data Importer Testing Kafka (Messaging) Cloud / Containers SSE
christidia ❤️
elinasyr ❤️ ❤️
kon-si ❤️ ❤️ ❤️ ❤️ ❤️ ❤️ ❤️
nickbel7 ❤️ ❤️ ❤️ ❤️ ❤️ ❤️ ❤️ ❤️
nikosece2019 ❤️ ❤️ ❤️

🏗️ Architecture (Microservices)

Diagrams: Component, Sequence, ER, Deployment

How it works

The whole architecture is based on microservices.
The client (frontend) communicates with the server through RESTful APIs (Total, Generation, Flows) that use SSE (server-sent events) for asynchronous, continuous data delivery: the server stores all open client sessions and pushes new packets to them whenever new data is inserted into the database. Each API endpoint is a separate microservice listening on its own port, so if one goes down, the rest keep functioning.
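
To make the SSE flow concrete, here is a minimal sketch of that pattern in Node.js/Express; the route path and the notifyClients() helper are illustrative, not the project's actual API.

```js
// Hypothetical SSE endpoint: keeps every open client session in memory
// and pushes new data to all of them when the database changes.
const express = require('express');
const app = express();

const clients = [];                         // open client sessions

app.get('/total/api/stream', (req, res) => {
  res.set({
    'Content-Type': 'text/event-stream',
    'Cache-Control': 'no-cache',
    Connection: 'keep-alive',
  });
  res.flushHeaders();

  clients.push(res);                        // remember this session
  req.on('close', () => clients.splice(clients.indexOf(res), 1));
});

// Called whenever new rows land in the database (e.g. after a Kafka message).
function notifyClients(payload) {
  for (const res of clients) {
    res.write(`data: ${JSON.stringify(payload)}\n\n`);
  }
}

app.listen(3000);
```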

Now, what happens at the back end?
We have split our microservices according to the three types of data streams (Actual Total Consumption, Generation per Energy Type, Energy Flows).
Each data stream has its own database (plus one that stores all the users), its own parser and its own importer.

Parser (Total, Generation, Flows):
The parser microservice is responsible for fetching the new data from an FTP server, parsing it by removing all unnecessary rows and columns, then zipping it, uploading it to the cloud storage bucket and notifying the importer through Kafka.
One can trigger the parser to download the latest CSV through a REST API endpoint (e.g. https://api-total-image-47nenum5kq-ew.a.run.app/total/api/parser/latest).
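
A rough sketch of that parser flow, assuming the @google-cloud/storage and kafkajs clients; the bucket, object and topic names are placeholders, not the project's real configuration.

```js
const zlib = require('zlib');
const { Storage } = require('@google-cloud/storage');
const { Kafka } = require('kafkajs');

const storage = new Storage();
const producer = new Kafka({ brokers: ['kafka:9092'] }).producer();

async function parseAndPublish(rawCsv) {
  // keep only the rows/columns the importer needs (placeholder filter)
  const cleaned = rawCsv
    .split('\n')
    .filter((row) => row.trim().length > 0)
    .join('\n');

  // zip the cleaned csv and upload it to the Cloud Storage bucket
  const gzipped = zlib.gzipSync(Buffer.from(cleaned));
  await storage.bucket('energy-live-data').file('total/latest.csv.gz').save(gzipped);

  // notify the importer through Kafka that a new file is ready
  await producer.connect();
  await producer.send({
    topic: 'total-new-data',
    messages: [{ value: JSON.stringify({ object: 'total/latest.csv.gz' }) }],
  });
  await producer.disconnect();
}
```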

Importer (Total, Generation, Flows):
The importer is the microservice responsible for "listening" to Kafka for new incoming data and importing it into the database.
More specifically, it communicates with the parser microservice through messaging: when a new CSV arrives in the cloud storage bucket, the importer downloads it, unzips it, deletes that month's existing data and then inserts the new rows. When the process finishes, it notifies the API microservices (again through Kafka) with the first and last date of the new data, so that the endpoints holding the client sessions only push updates to clients that actually requested data in the updated range.
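
The importer side could look roughly like the sketch below (again using kafkajs, @google-cloud/storage and pg; the topic, bucket, table and column names are assumptions, and the bulk insert is elided).

```js
const zlib = require('zlib');
const { Storage } = require('@google-cloud/storage');
const { Kafka } = require('kafkajs');
const { Pool } = require('pg');

const storage = new Storage();
const kafka = new Kafka({ brokers: ['kafka:9092'] });
const consumer = kafka.consumer({ groupId: 'total-importer' });
const producer = kafka.producer();
const db = new Pool();

async function run() {
  await consumer.connect();
  await producer.connect();
  await consumer.subscribe({ topic: 'total-new-data' });

  await consumer.run({
    eachMessage: async ({ message }) => {
      const { object } = JSON.parse(message.value.toString());

      // download and unzip the csv that the parser uploaded
      const [gzipped] = await storage.bucket('energy-live-data').file(object).download();
      const lines = zlib.gunzipSync(gzipped).toString().trim().split('\n');
      const first = lines[1].split(',')[0];              // assumes column 0 is the timestamp
      const last = lines[lines.length - 1].split(',')[0];

      // replace that month's data, then bulk-insert the fresh rows
      await db.query(
        "DELETE FROM total WHERE date_trunc('month', ts) = date_trunc('month', $1::timestamp)",
        [first]
      );
      // ... bulk INSERT of `lines` goes here ...

      // tell the API microservices which date range changed
      await producer.send({
        topic: 'total-data-updated',
        messages: [{ value: JSON.stringify({ from: first, to: last }) }],
      });
    },
  });
}

run().catch(console.error);
```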

☁️ Deploy (Google Cloud Platform)

Technologies Used

  • Cloud Run
  • Container Registry
  • Cloud SQL
  • Cloud Storage

🪟 Frontend

Technologies Used

  • HTML, SCSS, JS, AJAX
  • Webpack
  • OpenLayers (maps, see the sketch below)
  • Nunjucks
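
For the maps, the frontend uses OpenLayers; a minimal setup of that kind is sketched below (the target element, tile source and view settings are placeholders, not the project's actual configuration).

```js
import Map from 'ol/Map';
import View from 'ol/View';
import TileLayer from 'ol/layer/Tile';
import OSM from 'ol/source/OSM';
import { fromLonLat } from 'ol/proj';

// render a base map into <div id="map"> and centre it roughly on Europe
new Map({
  target: 'map',
  layers: [new TileLayer({ source: new OSM() })],
  view: new View({
    center: fromLonLat([10, 50]),
    zoom: 4,
  }),
});
```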

🐛 Testing

For more detailed information, see the testing folder.
