GET1033 Project Documentation

Automated Lighting & Doorbell/Alarm System

1. Project Overview

As mentioned in the Project Summary, this project models an automated lighting and doorbell/alarm system that activates only when it is dark: the light follows the person’s movements, and a sound output is triggered when the person comes close to the door/sensor.

1.1. Motivations

Coming up with the design for this project proved to be a challenge, as most of us in the group had little or no programming experience prior to taking this module. However, we wanted to attempt building and programming something that could be useful and relevant to our everyday lives. After the lecture in Week 7 on Physical Computing, where the Arduino was introduced to us, our group decided that we wanted to do an Arduino-based project, as it was a lot more hands-on and tangible, which we generally preferred.

After some brainstorming and discussion, we landed on the idea of building an automated lighting system, inspired by the lighting systems now used in toilets that turn on only when movement is detected inside. We wanted to extend this to an automated lighting system that could be used not just in a toilet but also along the hallway or stairs of a house, where finding the light switch can be inconvenient when it is already dark to begin with. After we reached this objective, we wanted to see if we could further improve the project by making the lights more interactive, so that they track the movements of the user. This evolved the project into something that could also be installed outside the user’s house to "guide" the user back home when it is dark.

We then added the finishing touches to the project by connecting the buzzer to the Arduino, effectively making it a full alarm/doorbell system with automated lights.

1.2. Project License

This project is licensed under the GNU General Public License v3.0. Details of the license can be found here.

2. Functionality

A video explaining how the system works can be found here. The flow chart below is a high-level diagram that shows how we got the components working together in the way we wanted them to. The code we implemented follows this flow of logic as well.

Flow diagram for the code

As seen in the flowchart, the Arduino always first checks whether the environment is dark enough for the other parts of the system to be activated. If it is not, it keeps monitoring the brightness of the environment. Once the brightness drops below the threshold value that we set, it activates the proximity sensor to check the distance of the object or person (if any) in front of it. If the distance it reads is less than the threshold distance we set, it activates the LED and the servo to point the LED towards the person. And if the distance the proximity sensor reads is less than the threshold we set for triggering the alarm, the buzzer starts playing the tune Hall of the Mountain King.
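
To make this flow concrete, the sketch below is a minimal version of the loop logic. It assumes an HC-SR04-style ultrasonic sensor and uses placeholder pin numbers, thresholds, and a single placeholder note in place of the calibrated values and the actual tune from our model.

[source,cpp]
----
#include <Servo.h>

// Placeholder pin assignments and thresholds, not the calibrated values from our model.
const int LDR_PIN = A0;        // voltage divider output, read across the LDR
const int TRIG_PIN = 9;        // ultrasonic trigger
const int ECHO_PIN = 10;       // ultrasonic echo
const int LED_PIN = 6;
const int BUZZER_PIN = 8;
const int SERVO_PIN = 3;

const int DARK_THRESHOLD = 600;      // analogRead() value above which we treat it as dark
const long LIGHT_DISTANCE_CM = 100;  // activate the LED and servo within this range
const long ALARM_DISTANCE_CM = 20;   // trigger the doorbell/alarm within this range

Servo servo;

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  pinMode(LED_PIN, OUTPUT);
  servo.attach(SERVO_PIN);
}

long readDistanceCm() {
  // Fire a 10 microsecond trigger pulse, then time the echo.
  digitalWrite(TRIG_PIN, LOW);  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH); delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  long duration = pulseIn(ECHO_PIN, HIGH, 30000);  // microseconds, 0 on timeout
  return duration / 58;                            // ~58 us per cm for the round trip
}

void loop() {
  // With the LDR wired as described below, the reading rises as it gets darker.
  if (analogRead(LDR_PIN) > DARK_THRESHOLD) {
    long distance = readDistanceCm();
    if (distance > 0 && distance < LIGHT_DISTANCE_CM) {
      digitalWrite(LED_PIN, HIGH);
      // Point the LED towards the person: map the distance to a servo angle.
      servo.write(map(distance, 0, LIGHT_DISTANCE_CM, 0, 180));
      if (distance < ALARM_DISTANCE_CM) {
        tone(BUZZER_PIN, 440, 200);  // single placeholder note, not the actual tune
      }
      delay(100);
      return;
    }
  }
  digitalWrite(LED_PIN, LOW);  // not dark enough, or nobody in range
  delay(100);
}
----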

The sections below explain how we implemented each component.

2.1. Light Sensor

We made use of a Light Dependent Resistor (LDR) to determine the brightness of the environment. The LDR’s resistance changes according to the brightness of the light incident on its surface, making it suitable for our purposes. After setting the LDR up in a series circuit with another resistor, we were able to read the voltage across the LDR using an analog pin on the Arduino and the built-in analogRead() function from the Arduino software.
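
A minimal calibration sketch along these lines is shown below. It assumes the divider output is wired to analog pin A0 and simply prints the raw reading, so a suitable darkness threshold can be read off the Serial Monitor.

[source,cpp]
----
// Minimal sketch, assuming the voltage across the LDR is read on analog pin A0.
const int LDR_PIN = A0;

void setup() {
  Serial.begin(9600);
}

void loop() {
  int reading = analogRead(LDR_PIN);  // 0-1023, proportional to the voltage on A0
  Serial.println(reading);            // with this wiring, the value rises as it gets darker
  delay(500);
}
----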

2.2. Proximity Sensor

Initially, we were planning to use an infrared LED and receiver for the proximity sensor. However, after consulting our tutor Dennis, we swapped over to an ultrasonic sensor instead. The ultrasonic sensor helps us determine the distance of objects in front of it by sending out an ultrasonic pulse and starting a timer at the same time. The timer stops when the sensor detects the returning pulse. Since we know the speed of sound in air, the time taken for the pulse to be reflected back to the sensor can be used to determine the distance of the object that reflected it.
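
The sketch below illustrates this timing-based measurement, assuming an HC-SR04-style sensor with its trigger and echo pins on pins 9 and 10. The conversion uses the speed of sound in air (roughly 343 m/s, or 0.0343 cm per microsecond) and halves the measured duration because the pulse makes a round trip.

[source,cpp]
----
// Illustrative only: pin numbers assume an HC-SR04-style sensor.
const int TRIG_PIN = 9;
const int ECHO_PIN = 10;

void setup() {
  Serial.begin(9600);
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
}

void loop() {
  // Fire a 10 microsecond trigger pulse.
  digitalWrite(TRIG_PIN, LOW);  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH); delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);

  // Time how long the echo pin stays high: the pulse's travel time out and back.
  long durationUs = pulseIn(ECHO_PIN, HIGH, 30000);  // 0 if no echo within 30 ms

  // Halve the duration for the one-way trip, then convert using the speed of sound.
  float distanceCm = (durationUs / 2.0) * 0.0343;
  Serial.println(distanceCm);
  delay(200);
}
----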

2.3. Actuators (Servo and LED)

Initially, none of us in the group knew about servos and how they could be used. Hence, our original idea was just to build an automated light system that would simply turn on when a person comes close. The idea of having the light follow the person’s movements only started forming after the tutorial we had in Week 10, where Dennis brought us to an electronics lab and introduced the servo to us. After playing around with it in the lab and learning how it works, I discovered that we could use it in our project to make the automated lighting more interactive. The distance readings from the proximity sensor were used to determine the control signal sent to the servo (through the use of Pulse-Width Modulation), which in turn determines the angle to which the servo’s arm rotates.
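
The sketch below shows this distance-to-angle mapping on its own, assuming the servo’s signal wire is on pin 3 and a placeholder tracking range. The Arduino Servo library’s write() call generates the pulse-width-modulated control signal for us, so the code only has to supply an angle between 0 and 180 degrees.

[source,cpp]
----
#include <Servo.h>

// Placeholder pin and range values, not our calibrated ones.
const int SERVO_PIN = 3;
const long MAX_TRACK_CM = 100;  // only track objects within this range

Servo servo;

void setup() {
  servo.attach(SERVO_PIN);
}

// Convert a distance reading into a servo angle; Servo.write() turns the
// angle into the control pulses (PWM) that the servomotor expects.
void pointAt(long distanceCm) {
  long angle = map(constrain(distanceCm, 0, MAX_TRACK_CM), 0, MAX_TRACK_CM, 0, 180);
  servo.write(angle);
}

void loop() {
  // In the actual project this argument comes from the ultrasonic sensor reading.
  pointAt(50);
  delay(100);
}
----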

3. Constructing the Model

Picture of the model that was presented in class

The final model that we created was scaled down so that we could demonstrate it appropriately within the space available at our lecture venue. The cardboard represents the walkway towards the door of the house, with the light and proximity sensor (both built on the same breadboard) placed where the door would have been. The LED was taped onto the servo, both to secure the LED itself to the servo’s arm and to secure the connection from the LED to its cables. After these two main components were assembled properly, we attached them to a piece of cardboard and calibrated the code according to the size of the model.

3.1. Timeline

The timeline here shows the progress of the project over the past few weeks leading up to the project submission:

  • Week 9:

    • Obtained components needed for project (shoe box, light dependent resistor (LDR), ultrasonic sensor, Arduino, resistors, cables etc.)

    • Read up and familiarised ourselves with the Arduino software.

  • Week 10:

    • Built proximity and light sensors.

    • Set up the serial monitor to check that both sensors were working as intended.

  • Week 11:

    • Optimised sensors via calibration.

    • Began integrating the different parts of the code that we worked on.

    • Obtained servo to build the moving LED.

  • Week 12:

    • Debugged using results from tests.

    • Built the moving LED and tested code to ensure moving LED was working as intended.

    • Integrated code to direct the servo with the light and proximity sensor code.

    • Wrote the code for the doorbell/alarm tune (Hall of the Mountain King).

  • Week 13:

    • Debugged and got the doorbell code working.

    • Finalised model by securing the various components to cardboard.

    • Final calibration after model was finalised.

    • Cleaned up the code by removing unnecessary comments and additional lines that we included to help us debug in our initial phases of the project.

    • Final presentation and compilation of documentation.

4. Critical Analysis

This artifact can be considered an appendage robot, based on the types of robots introduced during the lecture and the definition of a robot given in Lecture 7 on Physical Computing: a computer that can sense the environment, plan how to react (think), and do something (act). It is a highly interactive robot, as it responds to the movements of a person who comes close to the sensor.

The servo itself is a rather interesting device. In this artifact, it is used to point the LED at the person walking towards the door, acting as a Tangible User Interface (TUI). This is worth noting, as TUIs as a whole are still quite limited in their ability to change their form or physical properties in real time. While this limitation applies to the servo as well (there is a limited range of rotation of the servo arm), it is enough to achieve the purposes for which this artifact was created. It is also interesting that the servo, through electromagnetism (the rotation of the motor) and a control circuit, is able to convert something encoded digitally (the PWM signal sent to it) into a tangible change of state (the angle the servo arm rotates to and stays at). This change of state happens effectively in real time, which allows the servo, together with the LED, to act as a TUI that stays consistent with its underlying digital model (the distance reading computed from the proximity sensor and, subsequently, the PWM output sent to the servo). The reading on Radical Atoms mentions that a balance and strong perceptual coupling between the tangible and intangible representations is crucial to the success of TUIs. It seems likely that the servo with the LED works as a TUI because it meets this criterion: the tangible representation is the rotation of the servo arm, and the intangible representation is the PWM output sent to the servomotor.

Perhaps the use of electromagnetism might offer a viable path towards successfully creating and implementing 'Radical Atoms'.

In addition, on the idea of procedural rhetoric, this artifact can serve both as a deterrent to intruders and as a sign of hospitality towards guests. This duality of purpose means the artifact holds two different rhetorics. Depending on the intentions of the people who visit the house, they will either get the message "Do not try anything funny, I am watching you" and "You’ve been warned" if they come with malicious intentions, or the message "Welcome to my interesting house!" if they are openly visiting and have nothing to hide. In our lecture in Week 4, we discussed how video games can make arguments through the use of procedural rhetoric. Through the artifact my group made for this project, I realised that a single artifact can also hold more than one rhetoric if there are multiple purposes/contexts in which it can be used.

4.1. Affordances Analysis

In Lecture 2, we learnt about the 4 unique affordances of digital media proposed by Janet Murray, which are:

  1. Encyclopedic storage

  2. Spatialized representation

  3. Participatory engagement and

  4. Procedurality.

Based on these affordances, our group’s project would have the following profile:

Affordances profile of the project
  • Encyclopedic storage:

    • Low. For this artifact, there is not a lot of information stored on the Arduino board. The board has quite limited memory and only needs to store the code logic and the specific threshold values that we set and calibrated.

  • Spatialized representation:

    • Relatively low. For this artifact, there is no way for a user to directly navigate through the data stored or read by the Arduino. However, the information read by the Arduino is connected in interesting ways, such that by observing the outputs (the LED and the servo), the user is able to deduce the following information:

      1. If the environment has dim lighting,

      2. If there is a person approaching his house,

      3. If there is a person very near his door, and

      4. Roughly where that person is in the walkway towards the user’s house door.

  • Participatory engagement:

    • High. This artifact relies heavily on user input. The system only activates if there is a user present and if it is dark enough. After that, any response from the system (movement of the servo and continued lighting of the LED) is entirely determined by the actions of the user. If he moves closer, the servo and LED will track his movements. If he moves too close, the doorbell/alarm will be triggered. If he moves further away, beyond the threshold distance set, the LED will turn off and the servo will stop tracking him.

  • Procedurality:

    • High. As explained in the flowchart above, there is a fixed set of rules that the artifact follows in determining its output. This set of rules can be considered rather elaborate, as the rules are chained together: the fulfilment of one rule causes the artifact to check for the fulfilment of another.

Overall, the high procedurality and participatory engagement result in the artifact appearing highly interactive when it is activated.

4.2. Building Off Other Projects Seen

As mentioned in the Project Summary, typical projects that involve automated lights will only switch on lights when a person walks past a sensor. In this project, we have tried to do something slightly different by making the automated light more interactive through the use of the servo.

Furthermore, during the project demonstration in the final lecture, we saw another group with a similar idea that complemented ours really well: a lockdown system that keeps intruders trapped in the house until the police and relevant authorities arrive to catch the intruder (and disable the alarm). Our project builds on such projects, whose focus is on keeping intruders locked in. If both systems were used together and an intruder somehow got past the external alarm, the internal alarm and lockdown would still activate, keeping the intruder locked inside the house.

5. Further Improvements

In retrospect, there are several ways in which this project could be further improved:

  1. Use of more proximity sensors.

    1. We found that the ultrasonic sensor was extremely reliable and accurate in determining the distance of an object directly in front of it. However, it is not as accurate when the subject is standing to the left or right of the sensor, and this inaccuracy becomes more obvious the nearer the person is to the sensor. Using several ultrasonic sensors would allow the Arduino to determine more accurately how much the servo arm should rotate, letting it track the person’s movements more precisely. It would also allow the proximity sensing to cover a bigger area.

  2. Have two different tunes, one for the alarm and one for the doorbell.

    1. Currently, there is only one tune in the code that we wrote. If we wrote one more, one tune could be used to represent the alarm and the other to represent the doorbell. This would allow the user (and even the 'visitor') to easily tell whether it is a security breach or a guest.

  3. Implement a button to switch between doorbell and alarm mode.

    1. This would give the user full control over exactly how he wants the system to be used.

  4. Use of a Real-Time Clock module

    1. Another alternative would be to add a real-time clock module, which could switch the system into 'alarm mode' when it is late at night and into 'doorbell mode' during the day. A rough sketch of this idea is shown below.
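
The sketch below illustrates this idea, assuming a DS3231 clock module and the Adafruit RTClib library (neither of which was part of our actual build); it picks between two placeholder tones based on the hour of the day.

[source,cpp]
----
#include <Wire.h>
#include <RTClib.h>  // assumes a DS3231 module and the Adafruit RTClib library

RTC_DS3231 rtc;
const int BUZZER_PIN = 8;  // placeholder pin

void setup() {
  Wire.begin();
  rtc.begin();  // in a real build, the clock's time would also have to be set once
}

void ringBell() {
  DateTime now = rtc.now();
  bool nightTime = (now.hour() >= 22 || now.hour() < 7);  // placeholder hours
  if (nightTime) {
    tone(BUZZER_PIN, 880, 1000);  // 'alarm mode': longer, harsher placeholder tone
  } else {
    tone(BUZZER_PIN, 440, 300);   // 'doorbell mode': short placeholder chime
  }
}

void loop() {
  // In the full system this would be called when the alarm distance threshold is crossed.
  ringBell();
  delay(5000);
}
----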