Development of the Autonomous Surveillance Drone

Quadcopters are basically drones with four motors and propellers. Because a quadcopter can’t rely on the aerodynamic lift that acts on the wings of a conventional airplane, it is very hard to control manually, especially for a human, who reacts slowly compared to a machine. Most drones therefore rely on modern onboard sensors that detect the current dynamics and behaviour of the drone and adjust the motor speeds many times per second whenever the sensors detect an imbalance.

As a consequence, the drone stabilizes itself. With the dynamic stabilization handled by the machine, the pilot doesn’t have to take care of it and can control the drone’s position semi-manually by adjusting throttle, pitch, roll and yaw. If we go one step further, we can eliminate the mistakes a human would make by substituting the pilot with a software algorithm, so the drone is guided entirely by software.

Schematic of the 4 axes

To explain the actual context of this project: the goal was to win a MINT contest. The contest is called “Schüler machen MI[N]T” and is carried out every year by two companies, Grunewald GmbH in Bocholt and Clyde Bergemann in Wesel. This year the subject was “Safety first”. With that in mind, we searched for a solution that would enrich daily life with regard to security. For me the decision was pretty simple: a few weeks earlier I had watched the TED Talk by Raffaello D’Andrea, who presented the swarm intelligence of drones. I was very impressed and wanted to build a drone myself.

So I suggested doing something with a drone. The other team members agreed, but it was easier said than done. After a bit of research the solution was clear: we wanted to build an autonomous surveillance drone to monitor large industrial areas. The advantages over traditional surveillance systems like stationary cameras are considerable. First, a drone isn’t mounted to a fence or a high pillar; it can fly around the facility without missing any corner. Furthermore, an autonomous drone doesn’t require any external input and can guide itself. The drone can also be adapted to specific needs, for example with a thermal imaging camera. With these modern techniques, a conventional surveillance system is simply outdated in comparison with an aerial one.


The electronic concept


A quadcopter consists of an inertial measurement unit (IMU), four motors with propellers, four ESCs (electronic speed controllers), a chassis, a LiPo battery and a radio unit. In our case we simply bought the professional electronic components like the IMU from DJI, because there is no sense in building one yourself: it would further complicate the prototype and thus increase the likelihood of failure. The IMU of course needs a signal to know what to do. This signal is a PWM waveform emitted by a self-programmed Arduino MEGA 2560 with an ATmega2560 microcontroller.

The Arduino MEGA receives input signals from a human user (or, later, from the software) and sends these commands to the DJI Naza-M Lite (the IMU), which then executes the given commands and stabilizes the quadcopter. The ATmega2560 thereby acts as the radio unit. The great advantage of an Arduino MEGA as a radio unit is that you can program it however you want. So you can either guide the drone yourself or, as we did, attach multiple sensors to detect possible obstacles and program the autonomy. Condensed to the minimum, the trick is to manipulate the IMU and let the onboard ATmega2560 act as a radio receiver.
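
To make this more concrete, here is a minimal sketch of how such a pass-through could look on the ATmega2560. It uses the standard Arduino Servo library to generate the pulse train; the pin number, the single-channel setup and the plain serial link standing in for the Wi-Fi module are simplifications of my own, not the actual flight code.

```cpp
#include <Servo.h>

// One output channel to the DJI Naza; the real drone uses four
// (throttle, pitch, roll, yaw). The pin number is a placeholder.
const int NAZA_THROTTLE_PIN = 9;

Servo throttleOut;  // the Servo library generates the ~50 Hz RC pulse train

void setup() {
  Serial.begin(9600);                    // stand-in for the Wi-Fi link
  throttleOut.attach(NAZA_THROTTLE_PIN);
  throttleOut.writeMicroseconds(1220);   // start at the low end of the range
}

void loop() {
  if (Serial.available() >= 2) {
    // Read a 16-bit pulse width sent by the transmitter, high byte first.
    int high = Serial.read();
    int low = Serial.read();
    int pulse = (high << 8) | low;
    pulse = constrain(pulse, 1220, 1720);  // clamp to what the Naza accepts
    throttleOut.writeMicroseconds(pulse);  // pulse width in microseconds
  }
}
```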


Schematic circuit diagram


This schematic diagram shows how the drone works. On the left you can see the transmitter of the radio unit in a simplified view. It consists of another self-programmed microcontroller, an Arduino UNO, with two joysticks and a few buttons. To read the signal from the joysticks you need to know how a joystick works: a joystick is basically an arrangement of one potentiometer per axis, one for the x-axis and one for the y-axis. A voltage is applied to both of them, and a potentiometer, as the name suggests, can change its resistance.

The transmitter

According to Ohm’s law, the voltage drop is calculated via R=\displaystyle\frac{U}{I} \leftrightarrow U=R \cdot I. In this case we can use the formula for the ohmic resistance because we don’t have to deal with inductances or capacitances, and therefore no complex impedances, which makes the whole thing a lot easier. Otherwise we would have used X_{C}=\displaystyle\frac{1}{\omega \cdot C} for the reactance of a capacitor or X_{L}=\omega \cdot L for an inductance. So if the resistance increases, the voltage dropping across the potentiometer also increases. With the ATmega328 and its analog ports we measure exactly the voltage that remains after the drop across the potentiometer.

The analog-to-digital (A/D) converter of the ATmega328 has a 10-bit resolution (2^{10}=1024), returning integers from 0 to 1023. If we divide the 5 volts usually used in these kinds of circuits by 1024, we get the voltage resolution: \frac{5\,V}{1024}=0.0048828125\,V \approx 4.9\,mV. That should be good enough for our purpose. The next step is to map this signal to the range 1220-1720, because that is the range the DJI Naza understands. This can be done easily with the Arduino map() function. The signal is then sent via a Wi-Fi module to the main processor onboard the quadcopter, where it is merged with the data from the GPS and the IR sensor.
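
Put together, the transmitter side is only a few lines. This sketch reads one joystick axis and rescales it to a Naza-compatible value; the pin assignment and the plain serial output (standing in for the Wi-Fi module) are placeholders of my own.

```cpp
// Transmitter side (Arduino UNO): read one joystick axis and map it
// to the pulse-width range the Naza understands.
const int JOY_X_PIN = A0;  // potentiometer wiper of the x-axis

void setup() {
  Serial.begin(9600);  // stand-in for the Wi-Fi module
}

void loop() {
  int raw = analogRead(JOY_X_PIN);            // 0..1023 from the 10-bit ADC
  int pulse = map(raw, 0, 1023, 1220, 1720);  // rescale to the Naza's range
  Serial.write(highByte(pulse));              // send 16-bit value, high byte first
  Serial.write(lowByte(pulse));
  delay(20);                                  // roughly a 50 Hz update rate
}
```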

PWM waveform

You may be asking why the signal comprises exactly the interval from 1220 to 1720. Now comes a little digression. The “signal” of which I’m speaking all the time is a PWM waveform. Pulse-width modulation is a modulation technique used to control the power supplied to electrical devices. It uses a rectangular pulse wave whose pulse width is modulated, resulting in a variation of the average value of the waveform. To get a deeper understanding, consider a pulse waveform f(t) with period T, a low value y_{min} and a high value y_{max}. In addition, a so-called duty cycle D is required. The duty cycle D is defined as the ratio between the pulse duration and the period T of a rectangular waveform. The resulting average value of the waveform is given by \bar{y}=\displaystyle\frac{1}{T}\int_{0}^{T} f(t)\, dt. It’s as easy as that!
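
To plug in some numbers (an example of my own, purely to illustrate the formula): evaluating the integral for a rectangular pulse gives \bar{y}=D \cdot y_{max}+(1-D) \cdot y_{min}, so a pulse wave with y_{min}=0\,V, y_{max}=5\,V and a duty cycle of D=0.5 has an average value of \bar{y}=0.5 \cdot 5\,V=2.5\,V.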

The interval from 1220 to 1720 results from the pulse width. A modern RC servo’s position is defined by the width of the pulse, not by the PWM duty cycle. This differs from, for example, DC motor speed control, where the average voltage is the key factor. The majority of RC servos expect a pulse width of 1000\,\mu s - 2000\,\mu s. The DJI Naza also works with this kind of PWM signal, but in a range of 1220\,\mu s - 1720\,\mu s. That, after all, is the signal the Naza will receive. But before it is output to the Naza, the IR sensor has to check whether an obstacle is blocking the way and, if so, interrupt the whole process.


Obstacle avoidance using laser triangulation and infrared light


At first I tried to accomplish this with an ultrasonic sensor. The theory was more than clear: the sensor emits an ultrasonic sound wave, which is reflected by the obstacle; it then measures the time the wave needs to return to the sensor and calculates the remaining distance via s=\displaystyle\frac{v \cdot t}{2}, the factor of 2 accounting for the wave travelling to the obstacle and back. The software then decides whether the object is too close and intervenes if necessary. But as my teacher always says: experience is what you gain when you need it. After a lot of hard programming and many attempts I concluded that the brushless three-phase motors emit sound waves in a frequency range similar to the one the ultrasonic sensor works with. Obviously this can’t work, and the experiments confirmed it: the drone was terribly distracted and literally couldn’t even lift off.
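
For completeness, this is roughly what the abandoned ultrasonic measurement looked like in code. It assumes a common trigger/echo module in the style of the HC-SR04 (the exact sensor we used isn’t documented here), and the pin numbers are placeholders.

```cpp
const int TRIG_PIN = 7;
const int ECHO_PIN = 8;

void setup() {
  Serial.begin(9600);
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
}

void loop() {
  // Emit a 10 µs ultrasonic trigger pulse...
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);

  // ...and measure the round-trip time of the echo in microseconds.
  unsigned long t = pulseIn(ECHO_PIN, HIGH, 30000UL);  // 30 ms timeout

  // s = v * t / 2: speed of sound ~343 m/s = 0.0343 cm/µs, halved
  // because the wave travels to the obstacle and back.
  float distanceCm = t * 0.0343 / 2.0;
  Serial.println(distanceCm);
  delay(60);
}
```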


Anyway, I needed a solution, and I remembered that so-called infrared sensors exist. They operate with various techniques, such as measuring the reflected infrared radiation of an object to determine the distance. Another method is the travel-time measurement, where a short light pulse is emitted and the time \Delta t the light wave needs to reflect off the object and return to the source is measured. The distance l is then l=\displaystyle\frac{c \cdot \Delta t}{2 \cdot n}, where c stands for the speed of light, slowed down by n, the refractive index of the respective medium. The factor \frac{1}{2} results from the light wave travelling the distance twice. The big disadvantage of this method is the extremely short time measurement required, on the order of 10^{-9}\,s. If you do the math, \frac{c \cdot \Delta t}{2}=\frac{3 \cdot 10^{8}\,m/s \,\cdot\, 10^{-9}\,s}{2}=0.15\,m, so a timing resolution of one nanosecond only yields a distance resolution of around 15 cm, which isn’t very accurate. On the other hand, distances of up to 10 km can be measured very easily this way. Furthermore, such sensors get more expensive the shorter the distances they have to resolve.


Therefore I decided to go for an IR sensor that uses the so-called laser-triangulation method. Such a sensor is cheap compared to travel-time measurement. Furthermore, a sensor based on laser triangulation measures distance very accurately, especially in the range of 0 – 40 m. Laser triangulation means that a laser beam is focused onto an object. Next to the laser a small camera is installed, which observes the position of the laser spot. If the distance between the sensor and the object changes, the angle under which the laser spot is observed changes, and with it the spot’s position on the camera sensor. From the changing position on the camera sensor, trigonometry yields the distance. To make clearer what happens, here is a schematic.

Basic principle of laser triangulation
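
The geometry boils down to similar triangles. In a simplified pinhole-camera model (with symbols of my own choosing, not taken from the Sharp datasheet), let b be the baseline between laser and camera, f the focal length of the camera lens and x the offset of the laser spot on the image sensor; then d=\displaystyle\frac{f \cdot b}{x}. As the distance d grows, the offset x shrinks, which is also why triangulation sensors lose accuracy at long range.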

All in all, the sensor does these calculations by itself and outputs a voltage which, according to the datasheet, looks something like this.

Voltage-to-distance curve of the IR sensor

This shows nicely that the sensor is only usable within the range of 100-500 cm, because you have to find a function that maps the voltage to a distance, and the curve is only unambiguous (and therefore invertible) after its global maximum. For me, as a student without knowledge of advanced mathematics, it is only possible to fit a function to the part of the curve after the global maximum. So I hooked the sensor up to an Arduino and measured the voltage readings at various distances. With the help of Excel it is possible to do a regression analysis and find a suitable function for the voltage-to-distance relation.

Excel regression analysis | The graph matches the diagram above

But this function f(s), with s the distance, outputs the voltage. Thus the inverse function has to be formed: with it, the voltage output of the sensor serves as the input, and a calculated distance s is the output. From the voltage readings and the corresponding distances, Excel determined the best-fitting function to be f(s)=2476.3 \cdot s^{-0.358}. The inverse function is \displaystyle f^{-1}(u)=\frac{3.01924 \cdot 10^{9}}{u^{2.79329}}. Now it is possible to measure the distance of an object that lies in front of the drone and might lead to a crash, so a threshold can be set below which the drone starts to fly backwards to avoid hitting the object. That’s as far as the theory goes. Now comes the practice.
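
On the Arduino this inverse function is essentially a one-liner. The sketch below assumes, as I read the regression, that u is the raw 10-bit ADC reading the curve was fitted on; the pin and the threshold are placeholders for illustration.

```cpp
// Distance from the IR sensor via the inverse regression function
// s = 3.01924e9 / u^2.79329, with u assumed to be the raw ADC reading.
const int IR_PIN = A1;
const float THRESHOLD_CM = 150.0;  // start evasive action below this (placeholder)

void setup() {
  Serial.begin(9600);
}

void loop() {
  int u = analogRead(IR_PIN);  // 0..1023
  if (u > 0) {                 // avoid dividing by zero
    float distanceCm = 3.01924e9 / pow(u, 2.79329);
    if (distanceCm < THRESHOLD_CM) {
      // Here the autonomy code would override the current command
      // and make the drone back away from the obstacle.
      Serial.println("Obstacle too close, backing off");
    }
  }
  delay(50);
}
```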


The Fraunhofer lines and their theoretical impact on IR sensors


As I learned in my advanced physics course, so-called “Fraunhofer lines” exist in the natural solar spectrum. They are characteristic absorption lines that can easily be seen in a spectrum analysis. According to the datasheet, the Sharp GP2Y0A710K0F IR sensor operates at a wavelength of roughly \lambda =870 \pm 70\,nm. My assumption regarding the \pm 70\,nm is that it stems from manufacturing tolerances that cannot be controlled 100%, so Sharp cannot guarantee the exact wavelength of 870\,nm. To come back to the impact of the Fraunhofer lines on IR sensors, here is a picture of a solar spectrum analysis conducted here on earth.

Solar spectrum analysis | The Fraunhofer lines can be seen very easily

The wavelength in nm is plotted on the x-axis and the corresponding radiant intensity on the y-axis. As you can see, there are sharp dips where the intensity suddenly drops heavily. These are the so-called “Fraunhofer lines”. They result from the sunlight having to pass through the photosphere, the outer region of the sun, which led to the historical conclusion by Gustav Kirchhoff and Robert Bunsen that specific chemical elements absorb part of the radiation. From this I concluded that it might be reasonable to use an IR sensor that operates within the range of one of those Fraunhofer lines: in theory, most of the radiant intensity there is already absorbed, so light waves with a wavelength of 870\,nm should barely occur on earth. In this case the big gap shortly before 1000\,nm is the Fraunhofer line I’m interested in: at about 898.765\,nm the radiation is absorbed by molecular oxygen (O_{2}). Therefore the IR sensor can operate without being disturbed by daylight. And as practice showed, I was right: the IR sensor works flawlessly!


The finite element analysis 


Of course the drone was developed further in every aspect over time. A good example is the improvement of the stability of the drone’s legs. At the beginning I designed a CAD model that looked nice, but after some test flights it proved to have terrible stability: whenever the drone landed too fast, a leg broke at one specific point. Therefore I decided to conduct a finite element analysis, a method well known among engineers. Finite element analysis is a computerized method for predicting how a product reacts to real-world forces, vibration, heat, fluid flow and other physical effects. It shows whether a product will break, wear out, or work the way it was designed. The original leg (v1) looked like this.

The drone in a very early stage including v1 of the legs

The legs always broke shortly before the motor mount. To improve the legs’ stability I carried out a finite element analysis in the simulation module of Autodesk Fusion 360. I added a constraint where the leg is normally attached to the drone’s arm and applied a force of 20 N to the lower part of the leg, represented by the blue arrow.

The colors represent the stress distribution over the part: blue means very little stress, red means the part is about to fail. As you can see, the greatest stress is located near the motor mount, where the leg is attached to the drone’s arm. With this knowledge I could improve the stability and the overall landing gear construction, which now looks like this.

The drone’s landing gear

To give a résumé, I can say that it has been tough so far. I had to learn how to use a CAD program, how to program in C++ and how to conduct a finite element analysis. I had to lead a team of four students and was therefore responsible not only for motivation and coordination but also for the outcome of the project. These are only some of the skills I’ve acquired over the past 10 months. But I think the most important part is that this project taught me two fundamental lessons: never quit, even if it might not make sense on paper, and even if you fail, you’ll learn something new. That taught me patience and persistence.