The goal of this project is to create a Web Application in the form of a SaaS!
The end user should be able to subscribe to our monthly service, and we will provide them with detailed graphs and statistics of the daily energy data produced by every EU country.
We provide authentication with Google credentials as well as with our own authentication service!
Tools Used:
Platform Link:
🔗https://www.energy-live.com
YouTube Link:
🔗EnergyLive | presentation
Name | Diagrams / Architecture | Database | API | Frontend | Parser / Data | Importer | Testing | Kafka (Messaging) | Cloud / Containers | SSE
---|---|---|---|---|---|---|---|---|---|---
christidia | ❤️ | | | | | | | | |
elinasyr | ❤️ | ❤️ | | | | | | | |
kon-si | ❤️ | ❤️ | ❤️ | ❤️ | ❤️ | ❤️ | ❤️ | | |
nickbel7 | ❤️ | ❤️ | ❤️ | ❤️ | ❤️ | ❤️ | ❤️ | ❤️ | |
nikosece2019 | ❤️ | ❤️ | ❤️ | | | | | | |
Link to .vpp (Visual Paradigm) file
Component | Sequence | ER | Deployment |
---|---|---|---|
The whole architecture is based on microservices!
The client (frontend) first communicates through RESTful APIs (Total, Generation, Flows) with the server, utilizing SSE (Server-Sent Events) for asynchronous and continuous data communication (this means that the server stores all the client sessions and pushes new packets to them when new data is inserted into the database). Each API endpoint is a separate microservice listening on its own port; thus, if one goes down, the rest will continue to function properly.
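As a minimal sketch of that SSE pattern (assuming Express; the endpoint path, port, and payload shape are illustrative, not the actual repo code):

```js
// Minimal SSE sketch (Express). Endpoint path and payload are assumptions.
const express = require("express");
const app = express();

const sessions = new Set(); // every open client connection

app.get("/total/api/stream", (req, res) => {
  res.set({
    "Content-Type": "text/event-stream",
    "Cache-Control": "no-cache",
    "Connection": "keep-alive",
  });
  res.flushHeaders();
  sessions.add(res); // the server stores the session...
  req.on("close", () => sessions.delete(res)); // ...until the client disconnects
});

// Called whenever new rows land in the database: push one event to every session.
function broadcast(rows) {
  const payload = `data: ${JSON.stringify(rows)}\n\n`; // SSE wire format
  for (const res of sessions) res.write(payload);
}

app.listen(3000);
```

On the client side, a single `new EventSource("/total/api/stream")` is enough to start receiving those pushes.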
Now, what is happening at the back end?
We have separated our microservices according to the three types of data streams (Actual Total Consumption, Generation per Energy Type, Energy Flows).
Each data stream has its own database (plus one that stores all the users), its own parser, and its own importer.
Parser (Total, Generation, Flows):
The parser microservice is responsible for fetching the new data from an FTP server, parsing it by removing all unnecessary rows and columns, then zipping it, uploading it to the cloud storage bucket, and notifying the importer through Kafka.
One can trigger the parser to download the latest CSV through a REST API endpoint (e.g. https://api-total-image-47nenum5kq-ew.a.run.app/total/api/parser/latest).
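A rough sketch of that parser pipeline in Node.js (the FTP host, credentials, bucket name, Kafka topic, and kept columns are all assumptions, and the actual service may use different libraries):

```js
// Parser sketch: FTP download -> trim CSV -> gzip -> upload -> notify via Kafka.
// Hostnames, credentials, bucket and topic names are illustrative assumptions.
const fs = require("fs");
const zlib = require("zlib");
const ftp = require("basic-ftp");
const { Storage } = require("@google-cloud/storage");
const { Kafka } = require("kafkajs");

async function parseLatest() {
  // 1. Fetch the newest CSV from the FTP server.
  const client = new ftp.Client();
  await client.access({ host: "ftp.example.com", user: "user", password: "pass" });
  await client.downloadTo("raw.csv", "latest.csv");
  client.close();

  // 2. Drop unnecessary rows and columns (here: blank lines, all but 4 columns).
  const clean = fs
    .readFileSync("raw.csv", "utf8")
    .split("\n")
    .filter((line) => line.trim() !== "")
    .map((line) => line.split(",").slice(0, 4).join(","))
    .join("\n");

  // 3. Zip the result and upload it to the cloud storage bucket.
  fs.writeFileSync("clean.csv.gz", zlib.gzipSync(clean));
  await new Storage().bucket("energy-live-data").upload("clean.csv.gz");

  // 4. Notify the importer through Kafka that a new file is ready.
  const producer = new Kafka({ clientId: "parser", brokers: ["kafka:9092"] }).producer();
  await producer.connect();
  await producer.send({
    topic: "total-new-file",
    messages: [{ value: JSON.stringify({ object: "clean.csv.gz" }) }],
  });
  await producer.disconnect();
}
```

Exposing `parseLatest` behind the `/parser/latest` REST endpoint then gives the on-demand trigger described above.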
Importer (Total, Generation, Flows):
The importer is the microservice responsible for "listening" to Kafka for new incoming data and importing it into the database.
More specifically, it communicates with the parser microservice through messaging: when a new CSV arrives in the cloud storage bucket, the importer downloads it, unzips it, deletes the existing data for that month, and then inserts the new data. When that process finishes, it notifies the API microservices (again through Kafka messaging) with the first and last dates of the new data, so that the endpoint storing all the client sessions does not have to notify clients whose requested data was not updated!
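And a hedged sketch of that importer loop (the table schema, topic names, and the Postgres driver are assumptions; the real schema lives in the repo):

```js
// Importer sketch: consume Kafka -> download & unzip CSV -> replace rows -> notify APIs.
const zlib = require("zlib");
const { Storage } = require("@google-cloud/storage");
const { Kafka } = require("kafkajs");
const { Pool } = require("pg"); // assuming a Postgres-flavoured Cloud SQL instance

const kafka = new Kafka({ clientId: "importer", brokers: ["kafka:9092"] });
const pool = new Pool(); // connection settings come from the PG* env vars

async function run() {
  const consumer = kafka.consumer({ groupId: "total-importer" });
  await consumer.connect();
  await consumer.subscribe({ topic: "total-new-file" });

  await consumer.run({
    eachMessage: async ({ message }) => {
      const { object } = JSON.parse(message.value.toString());

      // Download the zipped CSV from the bucket and unzip it.
      const [gz] = await new Storage().bucket("energy-live-data").file(object).download();
      const rows = zlib
        .gunzipSync(gz)
        .toString("utf8")
        .split("\n")
        .slice(1) // skip the header row
        .filter((r) => r.trim() !== "");

      // Delete the existing data for that date range, then insert the new rows.
      const dates = rows.map((r) => r.split(",")[0]).sort();
      const [first, last] = [dates[0], dates[dates.length - 1]];
      await pool.query("DELETE FROM total WHERE date BETWEEN $1 AND $2", [first, last]);
      for (const r of rows) {
        const [date, value] = r.split(",");
        await pool.query("INSERT INTO total (date, value) VALUES ($1, $2)", [date, value]);
      }

      // Notify the API microservices of the updated date range, again over Kafka.
      const producer = kafka.producer();
      await producer.connect();
      await producer.send({
        topic: "total-updated",
        messages: [{ value: JSON.stringify({ first, last }) }],
      });
      await producer.disconnect();
    },
  });
}

run();
```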
Cloud Technologies Used
- Cloud Run
- Container Registry
- Cloud SQL
- Cloud Storage
Frontend Technologies Used
- HTML, SCSS, JS, AJAX
- Webpack
- OpenLayers (maps)
- Nunjucks
🐛 Testing
For more detailed info, go to the testing folder.