Tuesday, December 14, 2010

Final Manufacture

I was most involved in the final manufacture of the design, as opposed to the electronics or fabrication of parts.  Alex took some great pictures of the step-by-step way we put each of the pieces together.  Here they are:
Battery Module
This is the MDF base that Alex routed for the battery module.

This image shows how the lithium ion battery fits snugly into the frame.  There is a PIR motion detector in the small rectangular compartment on the right.

All electrical connections, circuit boards, and wires must be tucked into the base before the sandblasted acrylic can be bolted into place.  With all of the sensors attached this was nearly impossible to close.  We tried shortening some of the wires; screw extenders would also help.  It may also be possible to make the frame thicker or remove some of the sides so that there is a little more room for the wires.

Sensing Modules:
 These are the snapping mechanisms that we used to keep the suction cups on the modules.

This base is different from the battery module's.  The suction cup is below the center, and there are acrylic holders for the servos on each of the three arms.


This shows the servos in place with the bobbins attached.  The bobbins are made of acrylic and have fishing line wound around them.

This is an image of the transmitting/receiving module fully assembled.  The fishing line from the bobbins is tied through holes on top of the face; the holes were laser cut out of the acrylic so the line can pass through.  When a servo turns, its bobbin either lets out or takes in fishing line, so the top face moves according to the instructions the servos receive.  Many of the faces also have laser-cut holes so that fiber optics can be threaded through to create interesting effects with the LEDs in the face.
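
For anyone curious how the servos drive the faces, a minimal sketch along these lines (using the standard Servo library; the pin number and angles are just placeholders, not values from our actual code) shows the basic wind/unwind motion:

#include <Servo.h>

// Rough illustration only: the servo pin and the two angles are placeholders.
Servo bobbinServo;              // one of the three bobbin servos on a module

const int servoPin = 9;         // assumed signal pin
const int payOutAngle = 20;     // angle that lets fishing line off the bobbin
const int takeInAngle = 160;    // angle that winds fishing line back on

void setup() {
  bobbinServo.attach(servoPin);
}

void loop() {
  // Taking line in or letting it out moves the corner of the top face
  // that this bobbin's fishing line is tied to.
  bobbinServo.write(takeInAngle);
  delay(1000);
  bobbinServo.write(payOutAngle);
  delay(1000);
}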

Monday, December 6, 2010

Final Project: What Does It Do?

Wrote this almost a month ago but it never got published:

That is a good question.  Our design is such an open platform that we couldn't decide.  We couldn't decide what sensors to use.  We couldn't decide what to make it do or how it should react.  No one wanted to commit to any one thing.  This is why we decided to make our array of clusters very multifunctional...we decided not to decide, basically.  We will have clusters of five modules.  One will be the communicator and house the Arduinos and batteries, and the others will have different sensors detecting light, sound, distance, and motion.  At different times different sensors can be active, or the strongest signal relative to the others can determine the function.  What I would really like to see, though, is a hierarchy of sensors.  Maybe the light sensor realizes that it is daytime, so the motion sensor starts working; once it senses motion the little creatures start wiggling around, and then they stop and close up if someone gets close or makes noise.
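
Just to sketch what that hierarchy might look like in code (purely an illustration -- every pin, threshold, and routine below is a placeholder, not something we have actually wired up or run):

// Hypothetical sketch of the sensor-hierarchy idea.
const int lightPin    = A0;   // photoresistor voltage divider
const int motionPin   = 2;    // PIR motion detector output
const int soundPin    = 4;    // sound detector output
const int distancePin = 7;    // PING))) signal pin

const int daylightLevel   = 500;  // arbitrary analog reading for "daytime"
const long tooCloseInches = 12;   // arbitrary "someone is close" distance

void setup() {
  pinMode(motionPin, INPUT);
  pinMode(soundPin, INPUT);
}

void loop() {
  bool daytime = analogRead(lightPin) > daylightLevel;

  if (daytime && digitalRead(motionPin) == HIGH) {
    // Motion during the day: the little creatures start wiggling...
    wiggle();
    // ...and close up if someone gets close or makes noise.
    if (readInches(distancePin) < tooCloseInches || digitalRead(soundPin) == HIGH) {
      closeUp();
    }
  }
  delay(100);
}

void wiggle()  { /* drive the bobbin servos in a wave pattern */ }
void closeUp() { /* pull the lines in so the faces fold shut */ }

long readInches(int pin) {
  // Same PING))) timing trick as the range-finder test below.
  pinMode(pin, OUTPUT);
  digitalWrite(pin, LOW);
  delayMicroseconds(2);
  digitalWrite(pin, HIGH);
  delayMicroseconds(5);
  digitalWrite(pin, LOW);
  pinMode(pin, INPUT);
  return pulseIn(pin, HIGH) / 74 / 2;
}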

Final Project: Sensor Testing

To try to narrow down what we wanted our design to do, we tested many different sensors.  Here is some of the code from the sensors I tested:

PING Ultrasonic Range finder code modified from: http://www.arduino.cc/en/Tutorial/Ping

  

const int pingPin = 7;

void setup() {
  // initialize serial communication:
  Serial.begin(9600);
}

void loop()
{
  // establish variables for duration of the ping,
  // and the distance result in inches and centimeters:
  long duration, inches, cm;

  // The PING))) is triggered by a HIGH pulse of 2 or more microseconds.
  // Give a short LOW pulse beforehand to ensure a clean HIGH pulse:
  pinMode(pingPin, OUTPUT);
  digitalWrite(pingPin, LOW);
  delayMicroseconds(2);
  digitalWrite(pingPin, HIGH);
  delayMicroseconds(5);
  digitalWrite(pingPin, LOW);

  // The same pin is used to read the signal from the PING))): a HIGH
  // pulse whose duration is the time (in microseconds) from the sending
  // of the ping to the reception of its echo off of an object.
  pinMode(pingPin, INPUT);
  duration = pulseIn(pingPin, HIGH);

  // convert the time into a distance
  inches = microsecondsToInches(duration);
  cm = microsecondsToCentimeters(duration);
 
  Serial.print(inches);
  Serial.print("in, ");
  Serial.print(cm);
  Serial.print("cm");
  Serial.println();
 
  delay(100);
}

long microsecondsToInches(long microseconds)
{
  // There are roughly 73.746 microseconds per inch (sound travels at
  // about 1130 feet per second).  The ping travels out and back, so we
  // divide by 2 to get the distance to the object.
  // See: http://www.parallax.com/dl/docs/prod/acc/28015-PING-v1.3.pdf
  return microseconds / 74 / 2;
}

long microsecondsToCentimeters(long microseconds)
{
  // The speed of sound is 340 m/s or 29 microseconds per centimeter.
  // The ping travels out and back, so to find the distance of the
  // object we take half of the distance travelled.
  return microseconds / 29 / 2;
}


 The range finder works well and was put into our final project.


The following sound detector code simply turns an LED on when sound is detected:



int inputPin = 7;    // sound detector output
int val = 0;
int ledPin = 13;     // on-board LED

void setup() {
  pinMode(inputPin, INPUT);
  pinMode(ledPin, OUTPUT);
}

void loop() {
  // Turn the LED on whenever the detector reports sound.
  val = digitalRead(inputPin);
  if (val == HIGH) {
    digitalWrite(ledPin, HIGH);
  } else {
    digitalWrite(ledPin, LOW);
  }
}

The sound detector works well for nearby percussive sounds, but does not pick up voices well.  It should work for footsteps on tile floors like the ones in the NCRC where it will be installed, so it will still be used in the final design.

Sunday, November 7, 2010

As of October 28th...

For the week of the 28th of October, Design Team 1's main goal was to understand all of the sensors and tools we had purchased, as well as to decide what kind of interactions we wanted to have with our surface.

We tested flex sensors, which sense bending by detecting a change in resistance (a quick sketch of reading one is below).  We tested RF transmitters and receivers...Chris actually made a 3D mouse by creating a graphic on the computer screen that moved based on an accelerometer attached to a remote Arduino.  Jason tested a Mindflex toy, which he hacked to show "brain waves" on a computer screen.  There was also a little touchscreen that you can draw on, and a few other sensors, too.
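
For reference, reading a flex sensor really is just watching that resistance change.  Something like the following sketch (the pin and the fixed resistor value are assumptions on my part, not our actual wiring) prints a number that rises and falls as the sensor bends:

// Hypothetical flex-sensor test: the sensor in series with a fixed resistor
// (say 10k) between 5V and ground, with the junction wired to A0.
const int flexPin = A0;

void setup() {
  Serial.begin(9600);
}

void loop() {
  int reading = analogRead(flexPin);  // 0-1023; changes as the sensor flexes
  Serial.println(reading);
  delay(100);
}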

We split up into three groups of two, as we will for the rest of the semester, based on the tasks necessary.  I'm still not sure this is the best way to go, because things feel a little disjointed and communication in the group has never been that great in the first place.  Alex and I were supposed to work on an array, but this was really difficult since we didn't have a form or a function yet.  I did think that this was interesting, but we would want to do something that reacts to the surroundings instead of playing programmed patterns.

We thought about different ways to make a surface and what we consider a smart surface.  We want to build something that uses the RF transmitters and receivers to have an effect on the audience, we want to stick with the moving mechanism of the LFST, and we are inspired by tube worms as well.  One idea is putting two remote surfaces in the NCRC that respond to each other or do something related to the other surface's space.  We even discussed using our spring mechanisms to make a floor that would respond to people walking on it by making the ceiling move or light up.
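
As a very rough sketch of the two-surfaces idea (purely illustrative: this assumes the VirtualWire library with cheap 433 MHz modules, and every pin and value here is a placeholder), the transmitter side could broadcast a short message whenever its PIR sees someone, and the other surface could listen for it:

#include <VirtualWire.h>

// Hypothetical transmitter side: send a short message when the PIR fires
// so a second surface somewhere else can react.  Pins are placeholders.
const int txPin  = 12;
const int pirPin = 2;

void setup() {
  pinMode(pirPin, INPUT);
  vw_set_tx_pin(txPin);
  vw_setup(2000);               // bits per second
}

void loop() {
  if (digitalRead(pirPin) == HIGH) {
    const char *msg = "motion";
    vw_send((uint8_t *)msg, strlen(msg));
    vw_wait_tx();               // wait for the whole message to go out
    delay(500);
  }
}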

At this point I'm apprehensive about leaving so many decisions unmade, but the group has promised me that it will be better in the end if we just make what works and then tweak things at the end to add a function or make it look better...guess we'll just have to see.

Monday, October 25, 2010

Sketching on a TI84

Apparently more than one person has hooked up their graphing calculator to their Arduino...this video is pretty cool: http://www.youtube.com/watch?v=cMNOGHKQDQk

Sunday, October 24, 2010

Learning Organically

This is what we did on Thursday:
Got on the little orange bus

Went to the Botanical Gardens
There we learned about all the ways that plants kill things, reproduce, and use energy efficiently.  I had never been so interested in a greenhouse before.  I wish we could have taped each explanation of the plants and their mechanisms for world domination.  Here are some of the plants I thought were the most exciting:

Orchids






I was inspired by the way they are pollinated.  They direct the bees where to go and basically make them passively do all the work.  It is a two-step process that is very inefficient, but using sticky surfaces the plants may eventually be pollinated.  I think it would be neat to make an interactive smart surface that was so smart it could make people act the way it wanted them to.  Oh, and they're pretty, too.

Killer Plants

The two pictures above are of two different plants that get nutrients from insects by trapping them.  The bugs get stuck and then sit and rot until they are broken down enough for the plant to use them for nutrients.  The simplicity and passivity of this system could serve as inspiration, though for our purposes we probably wouldn't want rotting insects.

Window Plants!

Also known as living rocks, the window plants that inspired the team to use fiber optics in Project 2 were in the greenhouse!  The chlorophyll is actually in a thin layer along the edges of the plant, and the window on top lets sunlight penetrate to reach the chlorophyll at all times of the day.  The plants are also camouflaged as rocks so that predators won't find them.  Maybe our prototype could have a bio-inspired camouflage and an unrelated purpose...or act biomimetic and look like something inorganic.


Thursday, October 21, 2010

Please Quantify...

We got together this week to define our team and structure.  We seemed to agree on strengths, weaknesses, and our mission, but I got stuck on goal writing.  I sometimes wonder why I chose to go into engineering in the first place (typically at about 2 AM when I am still at the library), and talking about our team and project goals reiterated for me that I require numbers with my goals.  I need a way to test a hypothesis or answer a yes-or-no question to feel comfortable...this may not always be necessary (as my team is still trying to convince me), but it really makes me feel better.