Saturday March 13, 2010

We are the Robots

On Mondays we have a different class, a lecture called Introduction to Creative Technologies. The first lecture defined creativity from several perspectives and explored the history of creativity. It's interesting to note that creativity has gone from being regarded as something spiritual and outside of people, to a talent available only to certain people, to something within everyone that can be readily taught and brought out. My personal view is (put simply) that everyone is capable of creative thought, but for some people it's much more instinctive. Just like any other talent really. We also received an assignment designed to increase our awareness of creative thinking and practice by critically thinking about and reflecting upon events - basically, keep a review blog. We can review anything that we can define as an event, be it a concert, an exhibition, a film, a performance, a dining experience... Whatever, as long as we can pull it off. At least 8 different things over the next 3 months. You will see the link in the sidebar to this blog/assignment. The blog itself looks incredibly ugly at the moment; I'll be fixing that in due course...

On Tuesday we started to build on knowledge from our first week with 2 short exercises over 2 days, followed by a main project set to continue into the following week. In the morning we were introduced to the concept that all games have rules and every element in the game has a protocol to follow. For example, chess has 6 different pieces, each with their own protocol, such as a pawn only being able to move forward. We were to divide into groups of 4, come up with a new chess piece based on a type of character, and work out what characteristics that character has (the example we were given was "The Politician", who'd always sidestep and go back on their word). From there we'd design protocols for the piece to follow based on those characteristics (The Politician, for example, would side-step and move 2 steps back for every step forward). We'd then draw a diagram of its movements and present it later in the morning. Our group devised the Terrorist, a character that sneaks around and then blows itself and everyone around it up. The protocol was simple: it could only move in a forward direction, 2 units straight but only one sideways. I'm not entirely sure why the group decided that it could only move forward, but I consider it interesting given an article I read in New Scientist a few years ago about the psychology of a terrorist and the people surrounding one in the days/months leading up to a planned attack. There basically is no going back because of the social pressure and expectation placed upon them (although it really amounts to nothing but manipulation by their superiors).

When an opponent jumped the Terrorist, a roll of a die would determine how many immediately surrounding elements (including friendlies) would be taken from the game. I quietly wasn't keen on a terrorist at first (though I didn't speak up), but I thought the translation into an actual game piece was rather well done in some respects, and quite different.
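Looking back, the whole protocol could be sketched in a few lines of ordinary code. This is Python and entirely my own after-the-fact reconstruction (board size, function names, and my reading of "2 units straight but only one sideways" are all assumptions - we only ever drew it on paper):

```python
import random

BOARD = 8  # assuming a standard 8x8 chess board

def terrorist_moves(col, row):
    """Forward only: 1 or 2 squares straight ahead, or 1 square sideways."""
    candidates = [(col, row + 1), (col, row + 2),  # forward 1 or 2
                  (col - 1, row), (col + 1, row)]  # 1 step sideways
    return [(c, r) for c, r in candidates
            if 0 <= c < BOARD and 0 <= r < BOARD]

def blast_radius():
    """On capture, a die roll decides how many surrounding pieces
    (friend or foe) leave the game with it."""
    return random.randint(1, 6)
```

The point of writing protocols this way is that every rule becomes a checkable function, which is exactly the mindset the robot exercise asked for next.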

Once we presented the piece, the robots came out. Lego Mindstorms is a Lego set (obviously) that also comes with various sensors, 3 actuators (motors), and a controller unit. Hmmm, something familiar about this... It also comes with a visual programming language. When I say visual, I mean that commands are drag-and-drop blocks on the UI. Very visual, very basic (not BASIC). Our next task, in the same groups, was to build a tribot (3 wheels) that replicated the protocols of the chess piece we made. It also wasn't to bump into the robots the other teams would be making, when we placed them all into the same 2m x 2m square at 10am the next morning. The other three members were very intent on building a Lego robot so I decided to get familiar with the programming.

The next morning (Wednesday) the rules had changed slightly, but the main idea was for the robots not to bump into each other or they were disqualified. I'm not entirely sure what happened with our robot; it seemed to get caught in a pile-up of robots. It had been programmed to stop and manoeuvre around obstacles (at 45 degree angles like the chess piece), but we could not program it to back up as the Terrorist cannot go backwards. We could not control robots running into us. Our next challenge, in the same team, was to rebuild the robot and program it to find its way from the outside entrance of the BCT class through the corridor and to the administrator's office, where it would knock on the door and deliver a message. Just how we would achieve this was up to us.

The Mindstorms robot comes with an ultrasonic sensor, which detects objects and their distance. This can be used as a crude form of vision. A microphone detects sound (obviously) and its volume level. A light sensor detects the intensity of the light; that sensor also has a light of its own, shielded from the sensor, which can be used to bounce light off surfaces. It also comes with a push-button switch sensor. There are other sensors available through the Mindstorms website, but that is what is available in the basic package. We could make use of whatever resources were available, and we quickly found a program that used the light sensor to follow a line on the floor. We reversed one comparison made in the code so that instead of following a black line on a white floor, it would follow a white line on a dark floor (such as BCT's carpet), and we laid paper masking tape down.
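The logic of that program boils down to a single threshold comparison. A rough sketch in Python (the real thing is NXT-G blocks, and the threshold value and names here are my own stand-ins):

```python
THRESHOLD = 50  # assumed split between tape and carpet readings, 0-100 scale

def steer_for(light_reading):
    """One steering decision per light-sensor sample, hugging the
    edge of a white line on a dark floor."""
    if light_reading > THRESHOLD:
        return 'right'  # on the bright tape: curve off its edge
    return 'left'       # on the dark carpet: curve back toward the tape

def follow(readings):
    # Flipping the comparison above gives the original behaviour:
    # following a black line on a white floor.
    return [steer_for(r) for r in readings]
```

Reversing one comparison was all it took to swap line colours, which is why the found program was so easy to adapt.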

There were 3 versions of the same program available to us, each one more complex but faster than the last. We stuck to the basic program because the UI of Mindstorms frankly needs some work. Basics like scroll bars are missing. There is a hand tool, but as yet I'm unaware of a shortcut key to switch quickly between the hand and the selection tool, making navigating large programs clunky. Reverse engineering the most complicated program was out of the question, particularly as the group had a working robot and were wary of improving it and potentially breaking it (I'm a bit more gung-ho in my approach to things, which is why I find collaboration challenging at times). However, it was nice to have a working robot fairly early in the day; other teams looked like they were struggling a great deal.

Thursday morning at 10am our groups demonstrated our robots. A number of teams had simply programmed coordinates into their robots, so they didn't really sense anything. This tactic relied heavily on the placement of the robot at the beginning, and it turned out that traction on the carpet was an inconsistent variable. This was not a successful method. Our team and one other we shared the code with had robots that tracked the line, and they made it to the end (albeit slowly). But the team that impressed me the most had made use of 2 light sensors, one on each side of their robot, which allowed it to run at full speed, slowing only to correct when one sensor hit the line. See, our approach had the robot detecting one edge of the line and making minute adjustments to stick to it. Their robot drove in a general direction and only adjusted if it got too far off track. They had also made the program themselves. It was simpler yet more effective. This quietly annoyed me a little bit, as I'd probably have done this myself if left to my own devices. Anyway, at least one member of that team was also responsible for the PowerPoint slide show last week that looked really good (despite being PowerPoint, to boot), so we have some pretty smart cookies to watch.
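As I understood their approach, the two-sensor logic comes down to something like this (a Python sketch of what I imagine their NXT-G program did; the threshold and names are my guesses, not their code):

```python
THRESHOLD = 50  # assumed reading that distinguishes tape from carpet

def correction(left_reading, right_reading):
    """Both sensors straddle the line; drive straight at full speed
    and only correct when one sensor crosses onto the tape."""
    left_on = left_reading > THRESHOLD
    right_on = right_reading > THRESHOLD
    if left_on and not right_on:
        return 'steer right'      # drifted left over the line
    if right_on and not left_on:
        return 'steer left'       # drifted right over the line
    return 'full speed ahead'     # line still between the sensors
```

The difference from our edge-hugging version is that the default case is "do nothing", so the robot spends most of its time at full speed instead of constantly twitching along one edge.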

We were then given our assignment. In the same groups - possibly merged with another group - conceptualise, design and create a robot that draws based on sensory input. Drawing does not necessarily mean pen and paper, but I had a bit of difficulty at first grasping what else it could mean. Our group merged with another group and we all decided that it would be best if we went home and came up with ideas to present to one another the following day. Suited me, I had lots of life-admin to do; I cannot wait until things have settled for me on a personal basis (a few weeks off). I struggled with thinking about the mechanics and limitations of the robots, and thought more about the limitations of the programming language. It has loops and conditions, but as far as I can see it cannot branch out to other parts of the program. I did not have the language installed at home to investigate, though, and could not find a copy of it online (I'm pretty sure that it's open source and I imagine free, as it's really the hardware that Lego is selling). But I considered ideas such as a plotter, and then perhaps connecting it to an Etch-a-Sketch as that has 2 dials with which to draw.

Friday morning I came in to find our group had detached from the other group but reattached to a different group. Oookay... Not entirely sure about the events leading up to that, but it was no big deal. Just... odd. A few ideas had been presented, and for the first time I was feeling quite content to be in a group, as I felt a bit stumped for ideas at first. One member of our group had considered the use of lasers in some sort of light show, which at first I thought was a bit too performance-oriented and too abstract for my liking, but then I saw a Photoshop doodling fall out of another member's visual diary and it was very colourful, and there was something inspiring about it for me. Suddenly I realised how one can draw without the use of paint, pencils, felts, pens and 2D media such as paper. In a pretty decent collaboration of thoughts from various members, we decided to use 2 robots: one would do the sensing and transmit instructions based on that input to the second robot, which would execute the drawing. The drawing would be the use of laser pointers, prisms, mirrors, lights and whatever else we could think of to project an image onto a wall, which would then be recorded both as a performance and as a single piece of art by a still camera set to a continuous long exposure. We could produce potentially beautiful artwork. It was both creative and technical, and what appealed to me most of all is that it was ambitious. I usually find groups to lack ambition - it was very, very pleasant to see a group striving for a great goal without worrying about whether it can really be done. Of course it can be done; anything can be done when people stop questioning every single thing. The personal computer industry rose precisely because the people who brought it about were not aware that "it could not be done".

We divided into 2 teams, one building the sensing robot, the other building the robot that executes. The sensing robot is of course a lot easier, and our team of 2 people had it done by midday. The next task is to program it. My feeling is that the sensors should interact with each other in some way mathematically to produce more interesting instructions for the other machine. Of course I have no idea how to do this, and while I'm aware of what an algorithm is, I don't have the faintest idea how to produce one. But I'll learn. The other team have not had as much success yet, and their current designs are a little bit unsettling, so once our team has some code under way perhaps we will rejoin and see if we can also contribute. It is a complex task, especially as we have only 3 motors to control a number of lasers and prisms etc. Perhaps we could try to get hold of a third robot...

Sunday March 21, 2010

Lights! Camera! Action!

I'm going to try the following format this week....


Introduction to Creative Technologies: we had an incredibly interesting lecture from an industry insider on 3D printing technologies. Summarising: there are a few different processes, some superior to others, but all with a niche of one kind or another, and in the last couple of years the materials that can be used have come to include metals, plastics, textiles and even bio/organic materials. One company in the States has been building spare body parts (in the example we were shown, new bladders) for patients from cells grown from the patient's own stem cells, so that when the new part is implanted, there is no rejection. Experimental technologies included house building and printing at nano scale. It became apparent that this method of construction could be a major disruption to the economy, displacing many jobs.

Well, this has happened many times in the past, and sometimes the shift is very dramatic. From my own experience (although not direct, thankfully; I entered the industry after the most major changes) the Apple Mac and Adobe PostScript, page layout software, then CTP (Computer-to-Plate) and digital photography made loads of professions fairly worthless. Some people are scared of such change; I think if you are then you are more likely to wind up on the lower end of the socio-economic scale when your job gets displaced by new technology. Learn to deal with it.

The guest lecturer was however under the impression that the ability to print in 3D was a step on the road to more leisure time and 30-hour weeks. I'm no economics wiz but I don't think it works like that. If anything we have become busier in the last 30 years with the advent of computers for example. There are other factors involved too of course but what seems to happen is more useless crap gets produced for lower margins and the operators are expected to work faster and smarter to keep up. I don't see that changing until our culture and attitudes change.

Back in the studio, our team had found lasers on Trademe and had made a deal to get 3 for $100, which amounts to $14 per group member. This was a deal, so when it comes time to sell them we stand to get all our money back. We're poor students. The only concern to me was that they are all green; we could have paid $100 and got one blue and one green, buying a lower-powered red for $5. But the team believed the lower power of the red would not show up at all, and while I'm not so sure, I have no hard data and we can't test it, so three green lasers it was. Two members headed off to pick them up and the rest of us planned a few details on the robot. I tried to work out the maths behind the sensors on the sensing robot. As stated before, I want the software to process the input and churn out numbers based on that processing and the interaction of the sensors, rather than just a direct input to output. I found it a bit difficult to do in the studio so I finished it at home. I now have a lovely scribble flow diagram. Tomorrow I will translate this into actual "code" and also see where everyone else is at. I'm still not entirely sure about the actuating robot design, but tomorrow things should become clear for everyone as we will have all of our materials. I'm also a bit worried about how we are going to transmit from one robot to the other. The NXT computers have a bluetooth radio, but we've never used it. One group member believes we will be able to ask the 2nd year students; I'm not sure they will be much help or know much more than I can find out myself, but for now I'm going to be optimistic.


(Written on Wednesday) On Tuesday we reconvened and I attempted to turn the diagram in my book into code. The actuator robot had changed its design due to one member having had similar issues with the previous design too, so it looked more hopeful. But as I've become the coder for this project I haven't stuck my nose too much into the robot progress. I took the work home later in the day but started to consider that the new design didn't look too much like what I thought we'd all agreed on, and I might wind up coding movements that are not possible for the robot that has been designed. Hmmm...


I came in this morning and there was a finished-looking actuator robot. It looked more appropriate in some ways, but the new design had not taken into account that its movement was now limited and I'd have to somehow program that in, and Lego Mindstorms is not exactly a great "programming" environment. It's slow and clunky and lacks simple UI elements such as scroll bars, leaving you with a hand tool (and no keyboard shortcut) to move with. However, I decided not to worry about that yet as I still had to get the maths done to turn the sensor's inputs into outputs... I'd deal with how the outputs were interpreted at the other end later. I still had to work out how to send those outputs via bluetooth to the other robot. We did, however, get to try out laser light waved at a wall and photographed by a long exposure camera, and the results looked promising.

Laser through Lens

Laser Scribble

I spent the afternoon "writing" the code for the sensing robot, turning the maths into little programming blocks. It takes a while to get used to. Some of the things I'm trying to do seem convoluted in NXT but would be fairly basic in a real language. It makes me wonder if the compiled code that NXT produces is bloated and slow. An example of a basic operation would be to take the inputs of 2 sensors, add them together, then divide by the input of a third sensor in order to give a number used to control the speed of one of the motors. Basically I'm making up equations based on the inputs to come up with numbers for various parameters of the output instructions. One difficult one was converting distance from the ultrasonic sensor into a number from 1 to 4 depending on a certain range. I wrote 2 versions of the same code - I liked the 1st version better because it looked cleaner and more skilful (and probably executed faster), but with the second version, which had comparison loops within comparison loops, I could be more sure it was actually doing what I thought I'd told it to do. It's quite difficult to test the code; you can output the result to a screen on the NXT, but I found that difficult to get working properly. So I'm working kind of blind, and hoping my logic is solid.
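To make the idea concrete, here's roughly what those two operations look like in a real language (Python here; the NXT-G version is the same logic spread over arithmetic and comparison blocks, and the band boundaries below are illustrative, not the ones I actually used):

```python
def motor_speed(sensor_a, sensor_b, sensor_c):
    """Add two sensor inputs, divide by a third, to get a motor speed.
    max() guards against a zero reading blowing up the division."""
    return (sensor_a + sensor_b) / max(sensor_c, 1)

def distance_band(distance_cm):
    """Convert an ultrasonic distance reading into a number from 1 to 4,
    depending on which range it falls in."""
    if distance_cm < 25:
        return 1
    if distance_cm < 50:
        return 2
    if distance_cm < 100:
        return 3
    return 4
```

Two lines of Python versus a screen full of nested comparison blocks; that contrast is exactly why I suspect the NXT output is bloated.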

I'm a bit concerned that my group sits in the lab with me and mostly breaks my concentration. If I ever do this again I'm going to make a flow diagram with their input to the whole thing. It seems a bit silly to have one person know what's going on with the code while everyone else sits there bored. Plus I'm so busy with the coding that I haven't been able to make suggestions about the actuating robot. I think the design is mildly flawed, and not what I thought we originally agreed on.

I stayed until quite late with one member who was stuck there until 9pm. I took the robot home and mused that it would be my chance to fix the design, but there are a few good and obvious reasons not to: It's not my place to change things without group input. This project is not assessed but a learning experience, and the group needs a chance to learn from mistakes and reflect upon improvements. I don't need to be emotionally involved in it, but this does illustrate why I don't usually like working in groups. I mentioned this in a previous post but took it out later, fearing that it wasn't the most diplomatic thing to say on a blog classmates can read. Put simply, I've never worked in a group, even in a commercial/work situation, where the result was more satisfying than what I could have done on my own. This may sound like "Chris can't work in teams", but I can, and do. That is just my life-long experience of groups. And in BCT I'm somewhat older and more experienced but trying to avoid the "older student drawing on life experiences" stereotype, so it can be frustrating when I see some of the paths we are going down. Right, unprofessional rant over. Another reason why I didn't alter the robot is because I am tired and I don't have any resources available to me right now. This is as frustrating as hell because in past lives I've had money, tools and materials and a space to work in; at the moment I have none of that, plus all the shops are shut anyway. Off to bed.

So it was getting a bit urgent now because a demonstration is meant to happen tomorrow. First, let me explain the design flaw (in my opinion). While it's a better design than the previous one, we seem to have discarded the idea of using prisms and mirrors to direct the laser light onto the wall, instead mounting the lasers directly to the robot actuators. This limits the range of freedom available to the motors, which makes it harder to program, and sort of also means some of my code is redundant while I have to re-write other parts. Some of it I'm not sure how to write in such a short time. While the robot can detect motor rotation, it only knows the rotation from where it last was; it's not an absolute number. This means if the starting position is different, the robot could wave the lasers off course or the arms could crash into each other. I also think it reduces the artistic potential. Take a look at any laser in a night club and you will see that the laser is static and it's the mirrors in front of the laser that alter the course of the light. So I pointed this out at a meeting in the morning and was met with scepticism, so I demonstrated (I found having to demonstrate what seemed obvious to me a bit painful). There was still resistance, but I went in search of reflecting objects in the city. While out, the battery for the test laser failed and I had trouble finding the right objects. Bear in mind the demonstration is tomorrow and this is stuff I hoped the rest of the group would have done earlier. I had also run a test program for the bluetooth, and while it worked I'm not sure that it's really all that stable or workable for the project. While I was gone, the group had started coming up with alternative strategies, which I found very pleasing, as the weight of success being on my shoulders while given limitations I was unhappy with was getting to be burdensome.
I was happy to drop the bluetooth completely as an unnecessary complexity, especially as the new design had 2 receiving robots and I'm not sure how that would work given that there is what's called hand-shaking involved with bluetooth. That is, 2 bluetooth devices have to connect to each other and agree to talk; it's not just a case of one machine sending out signals like a radio station and nearby devices picking them up. No bluetooth did present the problem of what the robot would be sensing now, though. So it was also decided that the actuating robots would now sense sound and basically dance to music. Another member wrote a basic program while I was gone, tested it, and it worked. This was great, although I offered to submit parts of my code that processed the input to add some dimension to the program, but it was declined to keep things simple. At this point I didn't really mind, but I would like to point out that it was the bluetooth that failed, not the algorithms, and they could have made the code and possibly the dance more interesting. But I can understand that people do not want to know about anything to do with a certain approach once part of it fails them.
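The sound-reactive program wasn't mine, but the gist as I understood it is a simple loudness-to-motion mapping, roughly like this (a Python sketch with made-up bands and move names, not the actual NXT-G program my teammate wrote):

```python
def dance_move(sound_level):
    """Map a 0-100 sound-sensor reading to a motion: louder music
    means a bigger move. Bands and move names are illustrative."""
    if sound_level < 30:
        return ('sweep slowly', sound_level)
    if sound_level < 70:
        return ('spin', sound_level)
    return ('wave lasers', sound_level)
```

Crude, but with a stereo cranked up it reads as the robot dancing, which is all the demonstration needed.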

But we now had a working machine, not as grand as the original idea but I think everyone was over it, especially me. I also wanted to distance myself from the project a little bit now.

We waited until later in the night to record on video the team (or part of the team; 3 members had other commitments) setting up the robot and capturing our images. We first tried in the computer lab, then moved to the studio where we recorded our first images. We then went into the city to find a wall to project onto. I think the aim was to get video of people reacting. The problem was that it was far too light for a long exposure camera. I'm not a photographer, so my call that "it's too light" may not have carried much weight, but I thought this was kind of obvious from the beginning. Nevertheless we spent a couple of hours mucking around capturing shots.

From there we went back into the AUT building and into the stairwell. It was a lot darker. A member of our group wanted to aim the lasers at nothing in particular, just the staircase. I wasn't certain about this either, and I was quietly getting irritable as it was about 9pm. However, this method proved to produce the best shots of the night, so it was rather inspired. The effect was more 3-dimensional than projecting against a wall, and this is what he had planned. It worked. This is the same team member who came up with the sound-controlled alternative to bluetooth, so he'd really saved the day a couple of times.

As soon as it was over though I wanted to go home so I did.

Laserbot side view

This is the day of the presentation, and we got to see what other groups had come up with. I was quite impressed with all of the robots, although an awful lot of them had microphones attached and sensed sound, which was a bit of a pain given that we were also doing that now. I think sound was the most obvious choice, as light sensing works best when there is some degree of mobility. Touch sensing is just an on-off switch, and ultrasonic sensing also assumes either mobility or lots of movement around the robot. With sound you can crank up a stereo and say it's responding to the music (even if it is only doing so very crudely).

The first group had a robot that moved back and forth above a tank of water dropping food colouring to music, while a camera focused on the tank and recorded. It was quite nice and tranquil, especially as they weren't playing some terrible obnoxious commercial pop, which... well... nevermind..

Another group had 2 robots: one drew with chalk, charcoal or crayon while the other had an eraser, and they sensed each other and the wall that enclosed them. The artwork was kind of appealing.

Another group had a robot sensing strong light and music controlling a radio controlled car that had paint squeezed from it, making tyre tracks on the large canvas.

Other ideas included a robot that sensed the edge of the paper and drew dotted lines. As more lines were drawn, the robot could also sense those and react, so it was reacting to its own artwork. Another robot sensed music and... I'm not sure how it worked; it apparently controlled a mouse and was drawing lines in Photoshop. Another drew circles with salt, and the last also used a long exposure camera but threaded wool from the edges of a box that had wire catchers attached, like some sort of loom. They introduced it with a commentary about people feeling threatened by robots taking over their industries. I thought it interesting that they had done this, and made a machine that threaded wool, yet seemingly without knowing to reference the Luddites. Luddites were people who worked in the textile industry in the 19th century during the industrial revolution who protested (sometimes violently) against new automated looms that represented an end to their jobs. These days the term is used disparagingly to refer to people who reject new technology simply because they don't want to learn it. I've met a few in my time. Anyway, interesting parallel; even more interesting that it was accidental.

Our group presentation was an utter shambles. I had backed off a little bit because by this time I felt that not much of what I had said had gotten through and I'd sort of washed my hands of it. Perhaps not the best approach but I take a certain amount of pride in my work and my abilities and I don't take kindly to being credited with the work of others, whether it's good or bad.

There were technical difficulties that could have been overcome with preparation, and perhaps some of that preparation could have been made had we not been making our pictures last thing last night. But it was good in the sense that everyone probably learnt how not to present.

After that we were to work out putting the video footage together, which I really wasn't looking forward to. I'm a bit tired of group-think and taking 10 minutes to make any decision because it needs to be discussed and run by everyone who may or may not object, or someone saying after a decision has finally been made, "I have an idea". I had to pick up something, so I left a USB key containing the source footage with the group and quietly hoped that indecision would mean a lack of progress which I could then fix when I got back. I'm not sure if anyone else knows how to edit video, and I felt a strong need to salvage the whole project with a semi-decent looking video presentation. I got back and everyone had gone home. I'm not sure what decisions had been made, but it seemed that none had been. So I spent this weekend producing most of the video. Most, because no one gave me the photos that need to be inserted. Which means some painful group/studio time tomorrow adding the bits that need to be added... with a lengthy process of consultation with individuals who have never worked in a commercial environment or developed an eye for what looks professional, but still have equal say. Ugh. Not that my movie is completely up to my own standard anyway; I haven't actually used my video camera much nor shot footage for a long time and I need practice. The camera work was terrible on a couple of different levels. But it's probably quite passable for a 1st assignment from a 1st year student.

So this post probably looks a bit negative, and I am glad it's over to a degree. I find group work tedious and frustrating, but it's a skill I have to learn in order to succeed in BCT. At the moment I see groups as an obstacle to producing exceptional work; I need to learn how to leverage a group's potential instead. There were things, on reflection, that we could have done a lot better. First, testing the bluetooth thoroughly before we embarked on the mission might have been a very, very good idea, instead of waiting until we got to it. Some more research online about the technology in the NXT would have gone down well. In fact, in my opinion, research was the one thing that was lacking throughout the entire project. We are now academics, and research is crucial. It's funny actually; as a worker I researched things all the time before I went out and did them. Why this approach was not taken this time is a bit confounding. We have more resources available to us now, not just Google, which can produce pretty useless or incorrect results sometimes anyway. Although I only went to a class on how to use the AUT Library resources on Thursday during downtime, it did suddenly occur to me that we weren't testing our ideas. Had we done this there would have been less arguing about what we think is the correct approach. We could back up any thoughts with hard data. The team was friendly and I liked them on a personal level. They were pretty open to new ideas to a degree, yet not open to the execution of small details that became pretty important. I think that more planning would have meant less work. Another big thing missing was communication. We should have been able to communicate with each other while at home through social networks. At the very least through AUT portals such as our AUT email. Although with email, the cryptic addresses and webmail logins might have meant it wasn't utilised properly. Surely everyone has MSN or Google Talk or something similar.
Perhaps if everyone set up an AIM address, that way there would be separation of our personal social networks and our AUT ones (for those who don't want to bring their work into their social lives - I'd be one of them). Changing the direction of a project without consultation was a major issue, and I'm not sure how everybody missed the significance of a change in robot design to the programming. We should have got the whiteboard out, planned, researched and drawn diagrams. The coding should have been everyone's responsibility. I'm not saying everyone should have coded - there should be one or 2 dedicated coders - but everyone could have developed the logic. We are all BCT students, so no one can wave their hands in surrender and say "I don't know about this stuff" and switch off, like an office worker who is not interested in an explanation of why their Outlook mailbox is not receiving any more emails and how to avoid it in the future.

Being close to the project I can see its flaws. But we DID produce artwork and we did get the job done. To an outsider (aside from the presentation) we probably did okay.