====== Getting started with the Grove Arduino Kit ======

===== What is Grove? =====

Grove provides an easy way to connect sensors (components that measure things) and actuators (components that produce movement, light, or sound) to an Arduino microcontroller. Where prototyping is usually done on breadboards, with messy wiring, Grove uses a uniform connection system so all modules are plug-and-play. This means you don't have to worry as much about connecting everything correctly, and can spend more time on ideation and implementation.

===== 1. Installing the Arduino IDE =====

//(If you have used Arduino before, feel free to skip steps 1 & 2)//

To upload code to the Arduino, you need the Arduino IDE. You can install the latest version from [[https://www.arduino.cc/en/software|the official Arduino site]]. You can find detailed instructions on how to install the IDE for [[https://docs.arduino.cc/software/ide-v1/tutorials/macOS|macOS]] and [[https://docs.arduino.cc/software/ide-v1/tutorials/Windows|Windows]]. (The Windows Store version may cause some problems, so it's recommended to install it from the website directly.) During the installation process, make sure to check all the boxes and give permission to install all drivers.

===== 2. Navigating the Arduino IDE =====

You can find a detailed explanation of the Arduino IDE [[https://docs.arduino.cc/software/ide-v1/tutorials/arduino-ide-v1-basics/|here]]. To familiarize yourself with Arduino, please follow [[https://create.arduino.cc/projecthub/yeshvanth_muniraj/getting-started-with-arduino-bcb879|this tutorial]]. It covers how to:

  * Connect the Arduino
  * Verify and upload sketches to the Arduino
  * Run the 'blink' program

You don't have to be a master at Arduino coding for now; just the basics will suffice!

===== Building GroveBot =====

Now that everything is set up, we can start building our first robot using Grove!
We'll be recreating the example below, consisting of 2 hexagonal LED matrices (eyes), 2 speakers with an MP3 module, a face-tracking camera, and a touch sensor. The robot is based on the [[https://edwindertien.nl/animatronics/eyepi/|EyePi]].

==== MP3 Player/Touch sensor ====

  * First, put the Grove shield onto the Arduino.
  * Make sure the little VCC switch on the side is set to +5V.
  * Then, connect the Touch Sensor module to A0 on the Grove shield.
  * Then, connect the MP3 module to D2 on the Grove shield.
  * Connect the audio jack of the speakers to the connector on the MP3 module.
  * Connect the USB cable to any USB power source (laptop, power brick, power bank, etc.).
  * We still need some files to play. Put a few MP3 files onto the micro-SD card and put the micro-SD card into the MP3 module. You can create your own audio files by recording them yourself or by using e.g. [[https://ttsmp3.com/]]. The latter should be a bit quicker for now.
  * Next, we need to install the Grove MP3 library. To do this, go to //Sketch->Include Library->Manage Libraries//. Search for **'Grove Serial MP3 Player V2.0'** and click **'install'**.

Now run the following code:

<code arduino>
// Let's start by including the needed libraries
#include <SoftwareSerial.h>
#include <MP3Player_KT403A.h>  // header from the 'Grove Serial MP3 Player V2.0' library

// Then we define global constants
#define TOUCH_PIN A0

// And the rest
SoftwareSerial mp3(2, 3); // The MP3 module is connected on pins 2 and 3
bool prev_touch_value = 0;

void setup() {
  // put your setup code here, to run once:
  pinMode(TOUCH_PIN, INPUT); // Since we want to read the value of the touch sensor, we set it to INPUT

  // MP3
  mp3.begin(9600);           // Open a connection to the MP3 module
  delay(100);                // Wait 0.1 seconds for the MP3 player to boot up
  SelectPlayerDevice(0x02);  // Select SD card as the player device.
  SetVolume(0x1E);           // Set the volume, the range is 0x00 to 0x1E.
}

void loop() {
  // put your main code here, to run repeatedly:
  bool touch_value = digitalRead(TOUCH_PIN); // Get the current state of the touch sensor (true/1 = touching, false/0 = not touching)
  if (touch_value && !prev_touch_value) {    // If we are touching the sensor now, but weren't previously;
    PlayNext();                              // Play the next MP3 file
  }
  prev_touch_value = touch_value;            // Store the current touch state for the next loop
}
</code>

You should now be able to press the touch sensor and make an MP3 file play!

==== LED Matrices ====

Next, connect the **2 hexagonal LED matrices** to **D3** on the Grove shield. Since the LED matrices are built on WS2812B LEDs, commonly used in LED strips, we can **chain together** as many as we want while using only one Grove connector. Make sure that the signal from the Arduino goes into the connector marked **'in'** on the LED module and comes out of the connector marked **'out'**.

Next, we'll need to install the Adafruit NeoPixel library. To do this, go to //Sketch->Include Library->Manage Libraries//. Search for **'Adafruit NeoPixel'** and click **'install'**.

We can make sure everything is connected properly by running the example code: go to //File->Examples->Adafruit NeoPixel->strandtest//. Change **LED_PIN** to 4 and **LED_COUNT** to 74 (each Grove digital socket carries two pins; the D3 socket exposes pins 3 and 4, and the matrices here are driven from pin 4), then verify and upload the sketch. This should display some colorful animations on the matrices. Fancy!

==== HuskyLens ====

The HuskyLens is a very easy-to-use AI camera. It has an on-board mini computer to do all the computations, and a built-in screen to provide real-time feedback.

If you want to ensure the best performance, you can **update the firmware** of the HuskyLens by connecting the module to your PC with a micro-USB cable and following the steps on the [[https://wiki.dfrobot.com/HUSKYLENS_V1.0_SKU_SEN0305_SEN0336#target_5|HuskyLens wiki]]. However, this is **not required**, and this tutorial should work fine with the out-of-the-box version.
As the HuskyLens uses a different connector, we can't use the standard Grove cable for this one. Instead, use the **special cable** to connect it to any I2C port on the Grove shield. You can also use jumper wires to connect it to a normal Grove cable as shown in the image (red = VCC, black = GND, blue = SCL, green = SDA).

Once the HuskyLens is connected, it should automatically turn on and show the live camera feed on the built-in display. There are 2 buttons on the HuskyLens:

  * **The normal push button is for learning.** Point the camera at a face/object, click the button, and the HuskyLens will 'remember' that face/object. Pressing it again makes it forget. (In normal operation only 1 face/object can be learnt at any given time. If you want to identify more, please refer to the guide below.)
  * **The dial button is for selecting different functions/settings.** Dial to the left/right to switch between face, object, line, etc. detection.

To make the Arduino talk with the HuskyLens, we need to install a library. Since the library isn't available in Arduino's built-in library manager, we have to install it manually. To do this, download the repository [[https://github.com/HuskyLens/HUSKYLENSArduino|from GitHub]]: click the green 'Code' button and select 'Download ZIP'. Unpack the ZIP, and copy only the 'HUSKYLENS' folder (we don't need the .md files) to the Arduino/libraries directory on your PC/laptop. Restart the Arduino IDE so it loads the library. (In case it doesn't work, you can also zip the HUSKYLENS folder and use //Sketch->Include Library->Add .ZIP Library//.)

Next, we have to set up the HuskyLens to talk over I2C. To do this, turn on the HuskyLens and dial to the right until you see 'General Settings'. Click the dial to enter the settings, then dial to 'Protocol Type' and select 'I2C'. Make sure to also 'Save & Return' to save your changes.
After that, you can set it to 'Face Tracking' again. Now run the example code in //File->Examples->HUSKYLENS->HUSKYLENS_I2C// and confirm that everything works.

For more in-depth information on all the abilities and modes of the HuskyLens, the following resources may be helpful:

  * [[https://www.youtube.com/watch?v=E140gPLPz4A]]
  * [[https://wiki.dfrobot.com/HUSKYLENS_V1.0_SKU_SEN0305_SEN0336#target_4]]

==== Servos ====

To move the head of the robot, we use 2 servos in a pan/tilt configuration. The provided servos are MG996R servos, which can rotate 180 degrees. Since they consume quite a bit of power, we cannot power them directly from the Arduino. Instead, we need to power them separately. For this tutorial, we power them using a power bank and a custom-made connector. Connect the servos as shown in the image to D6 of the Grove shield.

==== Assembly ====

Now that we've connected and tested all components, we can build the robot! If done correctly, the connections should be as follows:

  * MP3 module -> D2
  * LED matrix -> D3
  * Servos -> D6
  * Touch sensor -> A0
  * HuskyLens -> I2C

The head can be assembled from a piece of cardboard. Hexagonal holes are cut into the cardboard to press-fit the eyes in. As an example, I created a base from cardboard as well to place the touch sensor and HuskyLens on, but feel free to come up with your own creative solution. The head I built is 6x4x14 cm.
==== Upload the code ====

We've already done the hard work for you and written some code to make this robot come alive:

<code arduino>
/* Example code for the Social Robot Interaction course
   Code written by Sjoerd de Jong

   Connections:
   MP3 MODULE   -> D2
   LED EYES     -> D3
   SERVOS       -> D6
   TOUCH_SENSOR -> A0
   HUSKY_LENS   -> I2C
*/

// --------------------------------------------------------------------------------- //
// ----------------------------------- VARIABLES ----------------------------------- //
// --------------------------------------------------------------------------------- //
// Let's start by including the needed libraries
#include <SoftwareSerial.h>
#include <MP3Player_KT403A.h>   // from the 'Grove Serial MP3 Player V2.0' library
#include <Adafruit_NeoPixel.h>
#include <Wire.h>
#include "HUSKYLENS.h"
#include <Servo.h>

// Then we define global constants
#define LED_PIN 4
#define NUMPIXELS 74
#define TOUCH_PIN A0
#define SERVO_PIN_1 6
#define SERVO_PIN_2 7

// And the rest
SoftwareSerial mp3(2, 3); // The MP3 module is connected on pins 2 and 3
Adafruit_NeoPixel pixels(NUMPIXELS, LED_PIN);
int LED_BRIGHTNESS = 8;   // 0-255

HUSKYLENS huskylens;
HUSKYLENSResult face;
bool face_detected = false;

bool prev_touch_value = 0;

enum Emotion {NEUTRAL, SUPRISED, HAPPY, ANGRY, SAD};
Emotion emotion = NEUTRAL;

Servo servo1, servo2;
float servo1_pos = 90, servo2_pos = 90;
float servo1_target = 90, servo2_target = 90;
float servo1_speed = 0, servo2_speed = 0;

long timer1, timer2, timer3;

bool pc_connected = false;
float servo1_target_pc = 90, servo2_target_pc = 90;

// --------------------------------------------------------------------------------- //
// ---------------------------------- EYE PATTERNS --------------------------------- //
// --------------------------------------------------------------------------------- //
// We store the different eyes as byte arrays, which is both space-efficient and readable
// You can also use https://sjoerd.tech/eyes/ to quickly design your own eye patterns
byte neutral[]  = { B0000, B01110, B011110, B0111110, B011110, B01110, B0000 };
byte blink1[]   = { B0000, B00000, B011110, B0111110, B011110, B00000, B0000 };
byte blink2[]   = { B0000, B00000, B000000, B1111111, B000000, B00000, B0000 };
byte suprised[] = { B1111, B11111, B111111, B1111111, B111111, B11111, B1111 };
byte happy[]    = { B1111, B11111, B111111, B1100011, B000000, B00000, B0000 };
byte angry[]    = { B0000, B10000, B110000, B1111000, B111110, B11111, B1111 };
byte sad[]      = { B0000, B00001, B000011, B0001111, B011111, B11111, B1111 };

// --------------------------------------------------------------------------------- //
// ------------------------------------- SETUP ------------------------------------- //
// --------------------------------------------------------------------------------- //
void setup() {
  // put your setup code here, to run once:
  pinMode(TOUCH_PIN, INPUT);

  // Initialize the leds
  pixels.begin();

  // Serial communication
  Serial.begin(115200);

  // MP3
  mp3.begin(9600);
  delay(100);               // Wait 0.1 seconds for the MP3 player to boot up
  SelectPlayerDevice(0x02); // Select SD card as the player device.
  SetVolume(0x1E);          // Set the volume, the range is 0x00 to 0x1E.

  // HuskyLens
  Wire.begin();
  while (!huskylens.begin(Wire)) {
    Serial.println(F("Begin failed!"));
    Serial.println(F("1.Please recheck the \"Protocol Type\" in HUSKYLENS (General Settings>>Protocol Type>>I2C)"));
    Serial.println(F("2.Please recheck the connection."));
    delay(100);
  }

  // Servos
  servo1.attach(SERVO_PIN_1);
  servo2.attach(SERVO_PIN_2);
  servo1.write(90);
  servo2.write(90);
}

// --------------------------------------------------------------------------------- //
// ----------------------------------- MAIN LOOP ----------------------------------- //
// --------------------------------------------------------------------------------- //
void loop() {
  // put your main code here, to run repeatedly:
  // Every 20 milliseconds, update the servos, HuskyLens, touch sensor, and emotions
  if (millis() - timer1 >= 20) {
    timer1 = millis();
    move_servos();
    husky_lens();
    touch_sensor();
    run_emotions();
  }
  // Every 10 milliseconds, handle communication with the PC
  if (millis() - timer2 >= 10) {
    timer2 = millis();
    communication();
  }
}

// --------------------------------------------------------------------------------- //
// ------------------------------------ EMOTIONS ----------------------------------- //
// --------------------------------------------------------------------------------- //
void run_emotions() {
  pixels.clear();
  switch (emotion) {
    case NEUTRAL:
      if      (millis() % 5000 < 150) display_eyes(blink1, 125);
      else if (millis() % 5000 < 300) display_eyes(blink2, 125);
      else if (millis() % 5000 < 450) display_eyes(blink1, 125);
      else                            display_eyes(neutral, 125);
      if (face_detected) {
        servo1_target = 90.0 + float(face.xCenter - 160) / 320.00 * -50.00;
        servo2_target = 90.0 + float(face.yCenter - 120) / 240.00 * 50.00;
      }
      break;
    case HAPPY:
      display_eyes(happy, 80);
      servo1_target = 90 + 10.0 * sin(millis() / 500.00);
      servo2_target = 80 + 15.0 * cos(millis() / 400.00);
      break;
    case SAD:
      display_eyes(sad, 150);
      servo1_target = 90 + 3.0 * sin(millis() / 400.00);
      servo2_target = 120 + 20.0 * cos(millis() / 500.00);
      break;
    case ANGRY:
      display_eyes(angry, 0);
      servo1_target = 90 + 10.0 * sin(millis() / 250.00);
      servo2_target = 110 + 15.0 * cos(millis() / 175.00);
      break;
    case SUPRISED:
      display_eyes(suprised, 125);
      servo1_target = 90;
      servo2_target = 80 + 10.0 * cos(millis() / 500.00);
      break;
  }
  pixels.show();
}

// --------------------------------------------------------------------------------- //
// ---------------------------------- TOUCH SENSOR --------------------------------- //
// --------------------------------------------------------------------------------- //
void touch_sensor() {
  // Read the value of the touch sensor. If it's being touched and wasn't previously, someone just touched it.
  // We can then play the next audio fragment and cycle through the emotions
  bool touch_value = digitalRead(TOUCH_PIN);
  if (touch_value && !prev_touch_value) {
    PlayNext();
    switch (emotion) {
      case NEUTRAL:  emotion = SUPRISED; break;
      case SUPRISED: emotion = HAPPY;    break;
      case HAPPY:    emotion = ANGRY;    break;
      case ANGRY:    emotion = SAD;      break;
      case SAD:      emotion = NEUTRAL;  break;
    }
  }
  prev_touch_value = touch_value;
}

// --------------------------------------------------------------------------------- //
// -------------------------------------- EYES ------------------------------------- //
// --------------------------------------------------------------------------------- //
void display_eyes(byte arr[], int hue) {
  display_eye(arr, hue, true);
  display_eye(arr, hue, false);
}

void display_eye(byte arr[], int hue, bool left) {
  // We will draw a circle on the display
  // It is a hexagonal matrix, which means we have to do some math to know where each pixel is on the screen
  int rows[] = {4, 5, 6, 7, 6, 5, 4}; // The columns have 4, 5, 6, 7, 6, 5, and 4 pixels respectively
  int NUM_COLUMNS = 7;                // There are 7 columns
  int index = (left) ? 0 : 37;        // For the second eye, we add an offset of 37 (4+5+6+7+6+5+4)
  for (int i = 0; i < NUM_COLUMNS; i++) {
    for (int j = 0; j < rows[i]; j++) {
      int brightness = LED_BRIGHTNESS * bitRead(arr[i], (left) ? rows[i] - 1 - j : j);
      pixels.setPixelColor(index, pixels.ColorHSV(hue * 256, 255, brightness));
      index++;
    }
  }
}

// --------------------------------------------------------------------------------- //
// ----------------------------------- HUSKY LENS ---------------------------------- //
// --------------------------------------------------------------------------------- //
void husky_lens() {
  if (!huskylens.request()) {}
  else if (!huskylens.available()) {
    // Serial.println(F("No face appears on the screen!"));
    face_detected = false;
  }
  else {
    // We loop through all faces received by the HuskyLens. If it's a face that we've learned (ID=1), we will track that face.
    // If no learned face is on the screen, we take the first face returned (which is the face closest to the center)
    face_detected = false;
    int face_index = 0;
    while (huskylens.available()) {
      HUSKYLENSResult result = huskylens.read();
      if (result.command == COMMAND_RETURN_BLOCK) {
        // Serial.println(String() + F("Block:xCenter=") + result.xCenter + F(",yCenter=") + result.yCenter + F(",width=") + result.width + F(",height=") + result.height + F(",ID=") + result.ID);
        if (face_index == 0 || result.ID == 1) face = result;
        face_index++;
        face_detected = true;
      }
    }
    // Serial.println(String() + F("Block:xCenter=") + face.xCenter + F(",yCenter=") + face.yCenter + F(",width=") + face.width + F(",height=") + face.height + F(",ID=") + face.ID);
  }
}

// --------------------------------------------------------------------------------- //
// ---------------------------------- SERVO MOTORS --------------------------------- //
// --------------------------------------------------------------------------------- //
void move_servos() {
  // We apply some smoothing to the servos and limit the speed
  // We do this because abrupt movements cause a big spike in current draw
  // If we are connected to the PC, we use the PC angles. Otherwise, we use the angles from the Arduino
  float servo1_target_ = (pc_connected) ? servo1_target_pc : servo1_target;
  float servo2_target_ = (pc_connected) ? servo2_target_pc : servo2_target;

  if (abs(servo1_target_ - servo1_pos) < 1) {
    servo1.write(servo1_target_);
    servo1_pos = servo1_target_;
  } else {
    servo1_speed = constrain(constrain(servo1_target_ - servo1_pos, servo1_speed - 0.1, servo1_speed + 0.1), -1.0, 1.0);
    servo1_pos += servo1_speed;
    servo1.write(servo1_pos);
  }
  if (abs(servo2_target_ - servo2_pos) < 1) {
    servo2.write(servo2_target_);
    servo2_pos = servo2_target_;
  } else {
    servo2_speed = constrain(constrain(servo2_target_ - servo2_pos, servo2_speed - 0.1, servo2_speed + 0.1), -1.0, 1.0);
    servo2_pos += servo2_speed;
    servo2.write(servo2_pos);
  }
}

// --------------------------------------------------------------------------------- //
// --------------------------------- COMMUNICATION --------------------------------- //
// --------------------------------------------------------------------------------- //
void communication() {
  char val = ' ';
  String data = "";
  if (Serial.available()) {
    do {
      val = Serial.read();
      if (val != -1) data = data + val;
    } while (val != -1);
  }
  // data is a string of what we received; we will split it into the different values
  // We receive multiple values from our PC as in "123,abc,123,"
  // We can then split this string and extract the values out.
  if (data.length() > 0 && data.charAt(data.length() - 1) == ',') {
    Serial.print(data);
    pc_connected = true; // Once we get a message from the PC, we turn off the touch sensor and do everything with input from the PC
    String value;
    for (int i = 0; data.length() > 0; i++) {
      value = data.substring(0, data.indexOf(','));
      data = data.substring(data.indexOf(',') + 1, data.length());
      if (i == 0) servo1_target_pc = value.toInt();
      if (i == 1) servo2_target_pc = value.toInt();
      if (i == 2) {
        if (value == "NEUTRAL")  emotion = NEUTRAL;
        if (value == "SUPRISED") emotion = SUPRISED;
        if (value == "HAPPY")    emotion = HAPPY;
        if (value == "ANGRY")    emotion = ANGRY;
        if (value == "SAD")      emotion = SAD;
      }
      // If more values are needed, add other lines here, e.g. if (i == 3) ...
    }
  }
}
</code>

The code takes input from the HuskyLens and uses it to track people's faces with the servo motors. It also displays different emotions and plays sounds depending on the interactions. Feel free to take a few minutes to play around with it. Also, take a look at the code and try to understand what it's doing; it has been commented, so it should be understandable.

===== Processing =====

An Arduino is great at many things, but it only has a fraction of the processing power of your PC. It would therefore be great if we could connect the Arduino to your PC to code even more complex interactions, like speech recognition, more complex computer vision, internet-based remote interactions, etc. Luckily, the Arduino can talk to your PC using the Serial protocol, which most programming languages support. You can connect the Arduino to a Python sketch, NodeJS, C++, etc. For this tutorial, we'll connect the Arduino to Processing. The Processing IDE has many similarities with the Arduino IDE, so it should be intuitive to use. You can download and install Processing [[https://processing.org/download|here]]. After downloading, run the following code.
This will connect to the Arduino and allow you to change the servo positions and emotion from a GUI:

<code java>
import processing.serial.*; // Import the Processing Serial library

Serial port;
String[] emotions = {"NEUTRAL", "SUPRISED", "HAPPY", "ANGRY", "SAD"}; // Define the emotions
String selected_emotion = "NEUTRAL";
PVector servoPos = new PVector(500, 500);

void setup() {
  size(1000, 1000);
  print(Serial.list());
  port = new Serial(this, Serial.list()[1], 115200); // Connect to the Arduino. Make sure to connect to the right port, for me it was [1]
}

void draw() {
  // Draw a nice GUI with buttons for emotions, and a circle we can move around to control the servos
  background(0);
  for (int i = 0; i < emotions.length; i++) {
    noFill();
    stroke(255);
    strokeWeight(5);
    float button_x = map(i, 0, emotions.length, 0, width);
    float button_w = width / emotions.length;
    if (mouseX > button_x && mouseX < button_x + button_w && mouseY < 100) {
      fill(255, 100);
      if (mousePressed) selected_emotion = emotions[i];
    }
    if (selected_emotion == emotions[i]) fill(100, 255, 100, 100);
    rect(button_x, 0, button_w, 100);
    noStroke();
    textSize(24);
    fill(255);
    textAlign(CENTER, CENTER);
    text(emotions[i], map(i + 0.5, 0, emotions.length, 0, width), 50);
  }
  fill(255, 0, 0);
  ellipse(servoPos.x, servoPos.y, 20, 20);
  if (mousePressed && mouseY > 100) {
    servoPos.set(constrain(mouseX, 0, width), constrain(mouseY, 100, height));
  }
  sendData();
  receiveData();
}

void sendData() {
  // Send the data to the Arduino in the format "123,abc,xyz,123,". We can send as many values as we like.
  String data = "" + int(map(servoPos.x, 0, width, 0, 180)) + "," + int(map(servoPos.y, 100, height, 0, 180)) + "," + selected_emotion + ",";
  port.write(data);
}

void receiveData() {
  // Receive data from the Arduino; currently the Arduino just sends back what it received
  int incomingData = 0;
  String read = "";
  while (port.available() > 0) {
    incomingData = port.read();
    if (char(incomingData) != '\n') {
      read += char(incomingData);
    }
  }
  if (read.length() > 0) {
    println(read); // You can add your own stuff here if you need to obtain values from the Arduino
  }
}
</code>

===== Further steps =====

Now that you have a working robot, it's up to you to add, remove, or change things. The Grove kit contains many other modules for you to experiment with. Maybe add a microphone to make the robot respond to sounds, add a Piezo buzzer to create robot beeps, display text on the text display, or add the small servo motors to control eyebrows. The possibilities are endless! You're also welcome to completely change the form factor of the robot. With some cardboard and tape, you can create any weird robot shape you want!

For connecting new modules to the Arduino, Google is your best friend. Searching "Grove Arduino Sound Sensor tutorial" will give you many results on how to connect the module, which libraries to install, and example code to get you started.