Monday, November 18, 2013

Soundsensor

Soundsensor

Below is a very brief test of the NXT sound sensor in the new sensor framework for the EV3.

During my test I was quietly humming, which shows up as the small spikes leading up to the big spike, which is where I clapped about 10 cm in front of the sensor.
DB mode
I tried to continue my humming at the same level, but in retrospect I did move a bit closer to the sensor, and my volume might have increased somewhat. The clap was controlled to roughly the same loudness.
DBA mode
Not much to say about the results. They are roughly as expected, though the values make less than perfect sense (what is 0.2 in relation to dB? What is actually returned is the sound level relative to the maximum sound level the sensor can record).
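For completeness, reading the sensor through the new framework looks roughly like the sketch below. The class and mode names (NXTSoundSensor, getDBMode(), getDBAMode()) are my assumptions about the sensor framework, so treat this as a sketch rather than a definitive listing:

```java
import lejos.hardware.port.SensorPort;
import lejos.hardware.sensor.NXTSoundSensor;
import lejos.robotics.SampleProvider;

public class SoundTest {
    public static void main(String[] args) throws Exception {
        // Assumed wrapper class for the NXT sound sensor on port 1
        NXTSoundSensor sound = new NXTSoundSensor(SensorPort.S1);
        SampleProvider db = sound.getDBMode();    // raw sound pressure
        SampleProvider dba = sound.getDBAMode();  // adjusted for human hearing

        float[] sample = new float[db.sampleSize()];
        for (int i = 0; i < 100; i++) {
            db.fetchSample(sample, 0);
            // The value is a fraction of the sensor's maximum recordable level, not actual dB
            System.out.println("DB: " + sample[0]);
            Thread.sleep(100);
        }
    }
}
```

Swapping `db` for `dba` in the loop gives the DBA readings used for the second graph.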

Compass Sensor

EV3 and HiTechnic Compass

Today I'll test if the HiTechnic Compass works as intended with the new EV3. My expectation is that the calibration will be difficult (as always), but that the rest should work.

Test setup

First off I'll place the compass sensor about 10 cm above the brick (which is then even further away from the motors). This is to limit the amount of electromagnetic noise registered from the actuators and the circuit board. The configuration can be seen here:
Distance of compass from the rest of the robot
I've found, in previous tests with the NXT, that the proximity of actuators and/or the brick itself has a significant impact on the readings of the compass. 10 cm should be quite enough to limit the interference to a negligible amount.

Results

I had to bring the setup out of my apartment, as the walls seem to shield me from the earth's magnetic field (hmm...), but once I found an environment where the compass had a chance of working, everything turned out great. Below is a video of my test:


Since I got the compass working at least to the same degree that I got it working on the NXT, I'd say this concludes my tests: the EV3 supports this compass to the same degree that the NXT did.
As for the video: the first part is a calibration of the compass. The second part is my attempt to film the changing degrees - I see now that the video is of too poor a quality to show this, but I can assure you that the degrees changed as I expected while I turned on the spot.
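The calibration-plus-read loop I used looks roughly like the sketch below. The class name HiTechnicCompass reflects my understanding of the framework; the startCalibration()/stopCalibration()/getCompassMode() method names are assumptions carried over from the old NXT API, so treat this as a sketch:

```java
import lejos.hardware.port.SensorPort;
import lejos.hardware.sensor.HiTechnicCompass;
import lejos.robotics.SampleProvider;

public class CompassTest {
    public static void main(String[] args) throws Exception {
        HiTechnicCompass compass = new HiTechnicCompass(SensorPort.S1);

        // Calibration: rotate the sensor slowly (roughly two full turns) during this window.
        // Method names are assumptions based on the old NXT API.
        compass.startCalibration();
        Thread.sleep(20000);
        compass.stopCalibration();

        SampleProvider heading = compass.getCompassMode();  // assumed mode accessor
        float[] sample = new float[heading.sampleSize()];
        while (true) {
            heading.fetchSample(sample, 0);
            System.out.println("Heading: " + sample[0]);     // degrees, 0-360
            Thread.sleep(200);
        }
    }
}
```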

Files

Friday, November 15, 2013

Gyrosensors

HiTech vs. EV3 Gyrosensor

In this post I will examine how the "old" HiTech Gyrosensor compares to the EV3 Gyrosensor. Seeing as the HiTech Gyrosensor detects degrees/s and the EV3 sensor allegedly returns some sort of accumulation of degrees (used to make 90-degree turns, according to the LEGO tutorial), I'm unsure exactly how to compare the two. During my examination of how the hardware is used by the people behind leJOS, I'll hopefully get a good idea of how to proceed.

Finding the old sensor in the new framework

For me this turned out to be a bit of a hunt. As it turns out, all the NXT-compatible sensors are organized into software packages based upon which manufacturer produced them. I spent some time trying to make the GyroSensor class work until I found HiTechnicGyro. Source code is linked below.

leJOS modes - how a sensor with multiple functions is treated

The EV3 sensor is able to give degrees/s OR accumulated degrees. In the new framework this is reflected in each sensor having multiple sensor modes. For the EV3GyroSensor the available modes are:
  1. Rates (degrees/s)
  2. Angles (accumulated degrees)
After choosing a given mode for a sensor, a SampleProvider is returned. This SampleProvider is then able to 'fetch' the data.
It should be noted that this SampleProvider framework is pervasive for all sensors AND that it is possible (but impractical) to hold all possible SampleProviders from a given sensor at the same time (for the EV3 gyro: both Rates and Angles). This adds the "hidden" cost of changing the sensor's read mode between fetches from the different providers. I tested alternating between Rates and Angles and found it added a latency of about 400 ms.
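To make the flow concrete, here is a minimal sketch of how the two gyros can be read through the mode/SampleProvider framework. The class and method names (EV3GyroSensor, HiTechnicGyro, getRateMode(), getAngleMode(), getMode(0)) reflect my understanding of the current leJOS EV3 snapshot and may differ in later versions:

```java
import lejos.hardware.port.SensorPort;
import lejos.hardware.sensor.EV3GyroSensor;
import lejos.hardware.sensor.HiTechnicGyro;
import lejos.robotics.SampleProvider;

public class GyroCompare {
    public static void main(String[] args) throws Exception {
        EV3GyroSensor ev3Gyro = new EV3GyroSensor(SensorPort.S1);
        HiTechnicGyro htGyro = new HiTechnicGyro(SensorPort.S2);

        // Each mode hands back a SampleProvider, which is what actually fetches the data
        SampleProvider rates  = ev3Gyro.getRateMode();   // degrees/s
        SampleProvider angles = ev3Gyro.getAngleMode();  // accumulated degrees
        SampleProvider htRate = htGyro.getMode(0);       // assumed: the HiTechnic gyro only has a rate mode

        float[] sample = new float[rates.sampleSize()];
        for (int i = 0; i < 50; i++) {
            rates.fetchSample(sample, 0);
            System.out.print("EV3 rate: " + sample[0] + "  ");
            htRate.fetchSample(sample, 0);
            System.out.println("HT rate: " + sample[0]);

            // Fetching from 'angles' in the same loop would force a mode switch on the
            // EV3 sensor, which is the ~400 ms "hidden" cost mentioned above.
            Thread.sleep(100);
        }
    }
}
```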
Having experimented with how to misuse the framework, I moved on to actually trying to solve my experiment.

Experiment

First I want to compare how the two sensors perform in the same 'mode'. Below is a video which illustrates how I move the two sensors in relation to each other (i.e. how I've built the device for testing). It is NOT, however, a video of my actual test data. For the actual test I only rotate the two sensors around the axis they're meant to register.


The video above shows, from all possible angles, how the setup is constructed. The data below was then collected through measured movements.

EV3 in rates mode (degrees/s)
First 90 degrees counter-clockwise, then 90 degrees counter-clockwise, then 180 degrees clockwise, then 90 degrees clockwise and lastly, 90 degrees counter-clockwise.

EV3 in Angles mode (accumulated degrees)
I see a rising degree of inaccuracy in the Angles mode, which isn't too surprising considering I'm never 'resetting' the accumulation. The first 180 degrees were more or less exactly the way I actually moved the sensor, though (fairly slowly).
The reason I didn't include videos of the above experiments is, quite simply, that I needed both hands to control the gadget because of the tangled wires.

Experiments with breaking the framework

Below is a graph of how badly the EV3 sensor seems to perform in relation to the HiTechnic gyro when the two modes take turns fetching from the hardware:
EV3 in BOTH Rates and Angles mode
It's fairly clear that this messes up the readings a lot, compared to how similar the graphs are when only Rates mode is used.

Files




Wednesday, November 13, 2013

UltraSonic

Comparison of NXT and EV3 UltraSonic Sensors

Today I'll examine these two sensors to see if there is any difference. I suspect that they will perform the same, but my overall goal is to examine all aspects of all the EV3's sensors. Since I've worked with the NXT sensors before, I'm using them as a benchmark for how well the newer EV3 sensors perform.

Test inspiration

The next couple of blog posts will be related to this other blog I helped create. The ultrasonic sensors were tested during week 3 of those experiments. When I refer to week numbers hereafter, I'm referring to this aforementioned blog.
Week 3 was dedicated to understanding how ultrasonic sensors work. My goal with the experiment below is not to recreate that experiment exactly as it was, but rather to test whether the experiment is feasible with the new EV3 using leJOS. While I'm doing this, I'm sure some sort of picture of how well the new hardware performs will emerge.

The experiment

My test-setup is as depicted below:

Setup for the distance test
The plastic top of the LEGO box was selected, as we found (in our previous tests) that measuring the distance to a non-vertical, thin plastic object posed a significant challenge to the NXT ultrasonic sensor.

Ultrasonic Distance Measurements

Actual distance* (cm) | NXT (cm) | EV3 (m)
45                    | 48       | 0,475
80                    | 87       | 0,834
97                    | 108      | 1,115
110                   | inf      | 1,206
120                   | inf      | 1,315
137                   | inf      | 1,499
>137                  | inf      | inf
* based on eyeballing where the sensor sits relative to the measuring tape

I tried my best to make sure that all measurements were taken at exactly the same distances, and I feel confident that they were all within 1 cm of the actual distance (to the bottom of the angled slope of the plastic).
At first I suspected that the big difference in accuracy was because my NXT sensor was damaged/worn, so I tried another one, with the same results. This leads me to conclude that the new EV3 ultrasonic sensor is actually quite a bit better at detecting difficult objects.
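For reference, here is a sketch of how both sensors can be read through the new framework. The class names (NXTUltrasonicSensor, EV3UltrasonicSensor) and port assignments are assumptions, not a definitive listing of the code I ran:

```java
import lejos.hardware.port.SensorPort;
import lejos.hardware.sensor.EV3UltrasonicSensor;
import lejos.hardware.sensor.NXTUltrasonicSensor;
import lejos.robotics.SampleProvider;

public class DistanceTest {
    public static void main(String[] args) throws Exception {
        // Assumed wrapper classes for the two ultrasonic sensors
        NXTUltrasonicSensor nxt = new NXTUltrasonicSensor(SensorPort.S1);
        EV3UltrasonicSensor ev3 = new EV3UltrasonicSensor(SensorPort.S2);

        SampleProvider nxtDist = nxt.getDistanceMode();
        SampleProvider ev3Dist = ev3.getDistanceMode();
        float[] sample = new float[nxtDist.sampleSize()];

        for (int i = 0; i < 20; i++) {
            nxtDist.fetchSample(sample, 0);
            System.out.print("NXT: " + sample[0] + "  ");   // the NXT column in the table above
            ev3Dist.fetchSample(sample, 0);
            System.out.println("EV3: " + sample[0]);        // metres; Infinity when nothing is detected
            Thread.sleep(500);
        }
    }
}
```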

Second test

I did implement the WallFollower mentioned on the other blog, but the result was just as bad as in our first attempt (I used the exact same code, adapted to the new sensor framework). These results are not included, but I wanted to confirm that it is indeed possible to make a wall follower (my implementation was somewhat deranged, but it did, more or less, follow a wall).
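The original code lives on the other blog; below is only a minimal bang-bang sketch of the idea, with assumed ports and motor assignments, to show that the structure carries over to the new framework:

```java
import lejos.hardware.motor.Motor;
import lejos.hardware.port.SensorPort;
import lejos.hardware.sensor.EV3UltrasonicSensor;
import lejos.robotics.SampleProvider;

public class WallFollower {
    public static void main(String[] args) throws Exception {
        // Assumption: the ultrasonic sensor is mounted on the right side of the robot
        EV3UltrasonicSensor us = new EV3UltrasonicSensor(SensorPort.S2);
        SampleProvider dist = us.getDistanceMode();
        float[] sample = new float[dist.sampleSize()];
        final float target = 0.20f;  // keep roughly 20 cm to the wall (the EV3 mode reports metres)

        Motor.B.forward();
        Motor.C.forward();
        while (true) {
            dist.fetchSample(sample, 0);
            if (sample[0] > target) {
                // too far from the wall: slow the wall-side wheel to steer towards it
                Motor.B.setSpeed(200);
                Motor.C.setSpeed(360);
            } else {
                // too close: steer away from the wall
                Motor.B.setSpeed(360);
                Motor.C.setSpeed(200);
            }
            Thread.sleep(50);
        }
    }
}
```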


Monday, November 11, 2013

Lightsensors

Overview

Today I'm going to experiment with the old and new light sensors on my EV3 brick. I'll do roughly the same exercise as I've previously done in this post on another blog, to see if I can recreate that result in the leJOS-EV3 framework.

Initial test

First off I'll simply try to read the light-intensity values from an array of differently colored LEGO-blocks. I've decided to slide the blocks along below the sensor, so I can leave it fixed and untouched between color-switches.
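Here is a sketch of how the readings below can be fetched. The class and mode names (NXTLightSensor, EV3ColorSensor, getAmbientMode(), getRedMode()) are assumptions about the sensor framework; "Without Floodlight" corresponds to the ambient mode, and "Floodlight (red)" to the reflected-red mode:

```java
import lejos.hardware.port.SensorPort;
import lejos.hardware.sensor.EV3ColorSensor;
import lejos.hardware.sensor.NXTLightSensor;
import lejos.robotics.SampleProvider;

public class LightTest {
    public static void main(String[] args) throws Exception {
        // Assumed wrapper classes for the two light/color sensors
        NXTLightSensor nxtLight = new NXTLightSensor(SensorPort.S1);
        EV3ColorSensor ev3Color = new EV3ColorSensor(SensorPort.S2);

        // Ambient mode = floodlight off; swap for getRedMode() to repeat the floodlight table
        SampleProvider ev3Ambient = ev3Color.getAmbientMode();
        SampleProvider nxtAmbient = nxtLight.getAmbientMode();

        float[] sample = new float[ev3Ambient.sampleSize()];
        while (true) {
            ev3Ambient.fetchSample(sample, 0);
            System.out.print("EV3 ambient: " + sample[0] + "  ");
            nxtAmbient.fetchSample(sample, 0);
            System.out.println("NXT ambient: " + sample[0]);
            Thread.sleep(500);
        }
    }
}
```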

Without Floodlight

Color                    | NXT-Light | NXT-Color      | EV3-Color | Old NXT-Light readings
White                    | 0,4422    | 0,8436         | 0,17      | 31
Yellow                   | 0,4237    | 0,8436         | 0,15      | 29
Red                      | 0,3973    | ...            | 0,11      | 30
Turquoise                | 0,3426    | Sensor broken? | 0,065     | 22
Green                    | 0,3143    |                | 0,04      | 23
Purple                   | 0,3758    |                | 0,09      | 29
Blue                     | 0,3495    |                | 0,07      | 25
Light-grey               | 0,3534    |                | 0,075     | 25
Dark-grey                | 0,3074    |                | 0,035     | 23
Black                    | 0,2547    |                | 0,02      | 18
Diff from White to Black | 1:1,74    | ??             | 1:8,5     | 1:1,72

Different values, but the same white-to-black ratio for the two NXT-Light readings. The interesting part is that the ambient-light ratio of the EV3-Color sensor is actually quite a bit better. I'm a bit put off by the lack of precision in the reported values, though.

Now with the floodlight (red LED-light) enabled:
Floodlight (red)

Color                    | NXT-Light | EV3-Color | Old NXT-Light readings
White                    | 0,6376    | 0,665     | 60
Yellow                   | 0,6259    | 0,59      | 60
Red                      | 0,6122    | 0,41      | 54
Turquoise                | 0,4462    | 0,075     | 33
Green                    | 0,4911    | 0,05      | 37
Purple                   | 0,5574    | 0,21      | 47
Blue                     | 0,5057    | 0,05      | 33
Light-grey               | 0,5595    | 0,22      | 48
Dark-grey                | 0,5116    | 0,11      | 42
Black                    | 0,4813    | 0,04      | 35
Diff from White to Black | 1:1,32    | 1:16,625  | 1:1,71

Here, the EV3-Color sensor actually performs much better at detecting the difference between white and black. Most of the tested colors lie between 0,11 and 0,4, though, which leaves them within a fairly small ratio of 1:2,75 (which is still a lot better than the old sensor).
Suspected cause of the difference: the old light sensor sits approximately 4 mm closer to the surface I'm reading, at a distance of 5 mm, whereas the new sensor sits at a distance of 9 mm.


EV3 Color Sensor
NXT Light Sensor
Now for a test where I'll ensure that the distance from the sensor-head is equal (5mm)...
EV3-Color

Color                    | Without Floodlight | Floodlight (red)
White                    | 0,11               | 1,05
Yellow                   | 0,07               | 0,97
Red                      | 0,05               | 0,66
Turquoise                | 0,04               | 0,13
Green                    | 0,02               | 0,095
Purple                   | 0,05               | 0,34
Blue                     | 0,04               | 0,14
Light-grey               | 0,04               | 0,41
Dark-grey                | 0,02               | 0,21
Black                    | 0,01               | 0,19
Diff from White to Black | 1:11               | 1:5,53


Hmm, even worse... Now the scale is broken for the white value, which can only be described as BAD. I checked the implementation notes in the code: values should be in the range [0.0;1.0], so I suspect these readings may be misleading :\

Saturday, October 19, 2013

First leJOS EV3 program

First leJOS EV3 program

Okay, so this post isn't going to start off with a working example. But that's only because I've got something much more important to write about: a quick fix for the ever-present microSD-card problem.

Way to get the microSD-card out again!

The original solution is found here, so all credit should go to JGeo.

sdCard Bottom
sdCard Top

Eventually I came to the conclusion that I'm not going to extract the card all that often, though. Transferring files via the wireless setup is actually pretty easy.

sdCard inserted in EV3


Actual testing

First off I tried to simply cram the required packages and classes into a shared project in order to force-feed the whole project to the EV3. In my mind it was a "quick" solution to test out the basics. In one word: Don't. I ended up wasting more time than I care to admit here.
After following the guide, I was able to actually make my EV3-leJOS work again (I broke it with my "quick" test). Setting up the development environment was pretty trivial. What did take some time, however, was upgrading my distribution of Ubuntu. Again, don't do that. 13.10 messes up all menu-related tasks for any Java program; hence my IDE (Eclipse) was unusable. To keep this part of the rant to a minimum, it can be fixed as follows (this disables the Unity menu and instead uses the built-in Java menu for Eclipse):
  1. Open up a terminal
  2. Write: "export UBUNTU_MENUPROXY=0" (this will disable said menu for anything launched from this terminal in this session. Returns to normal for any other terminal)
  3. Run eclipse: "eclipse"
Now, having fought my way through this jungle of obstacles in a single day, I was somewhat annoyed to get a NoClassDefFoundError when following the development guide section of the wiki. This error was mostly due to me needing a cup of coffee and forgetting to include the ev3classes project when exporting my project to the jar file. So remember to include ev3classes both at compile time AND at export time. The IDE is not smart enough to figure out what might be available in a later execution environment.

Enough talk of failures, here is my amazing driving robot!


... okay, so not that impressive. But it's a running robot driven by leJOS on EV3, which was my goal all along. As far as I can see, the leJOS team seems to have ported most of the old framework (if not all?), and some of the sensors seem to be supported already.
Seeing as the sensors seem to be very similar to the old NXT-sensors, I would expect that most of them work. More experiments on that later on.
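The program behind the video is nothing more than driving the motors forward for a few seconds. A minimal sketch along those lines, assuming the wheels sit on ports B and C, could look like this:

```java
import lejos.hardware.motor.Motor;

public class Drive {
    public static void main(String[] args) throws Exception {
        // Motor ports are an assumption about how the wheels are wired up
        Motor.B.setSpeed(360);   // degrees per second
        Motor.C.setSpeed(360);
        Motor.B.forward();
        Motor.C.forward();

        Thread.sleep(3000);      // drive straight for three seconds

        Motor.B.stop(true);      // true: return immediately so both motors stop together
        Motor.C.stop();
    }
}
```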

Saturday, October 12, 2013

Experimentations with leJOS

Experimentations with leJOS

This is a very, very brief description of my experimentation with leJOS; getting it up and running and executing the initial precompiled Hello World sample-program.

First off, I simply followed the guide located here. Below are comments on where I deviated from the guide's descriptions, but I recommend you take a look at the link above, as I will reference it directly in the text below.

Hardware/software

Below is a list of the hardware and software I used to get leJOS up and running on my brick-and-laptop setup.
  1. Laptop running Ubuntu 13.04
  2. Brick Hardware: V0.60
  3. Brick Software: V1.03E
  4. Wifi-Dongle: Netgear N150 (WNA 1100)
  5. Micro-SD Card: SanDisk Ultra microSDHC 16 GB Android
  6. Card Reader/writer: Transcend RDF5K card reader (SDHC/XC/UHS-I)
  7. Wireless router: Cisco Linksys E4200
A quick run-through of my thoughts behind each of these seems to be in order. Number 1, Ubuntu for my laptop: while it is, according to the guide, possible to simply run Linux emulated/virtually, I thought that this would simply add another layer of possible failure on top of my initial concerns regarding the whole operation. I see no real problem with emulating the software instead of switching operating system, but on the whole, I prefer to work with what a given software solution is "natively" intended for. I'll continue to test all the software from LEGO on a separate Windows 8 machine, if needed.
I updated my brick fairly recently, and the hardware version is simply what the brick reports under the "brick information" menu item of the LEGO software interface.
The dongle was more or less an accident. I got it from somebody else who was planning to experiment with the EV3, but then decided against it after purchasing the dongle. Coincidentally (maybe?) it's exactly the same dongle recommended by the guide, and the only model explicitly supported by the brick (though all other models with the same chipset should be supported as well).
The wireless router was simply what I had available to me at home. My network was set up with a WPA2 authentication, which to my pleasant surprise, was exactly the same Andy used for his guide. Nice.
The card reader was practical, and more importantly, actually in stock when I purchased the micro-SD card.
Lastly I get to the microSD card itself. This one actually took a bit of thought. The guide simply suggests any card of at least 2 GB capacity. It says nothing about how this memory is going to be used by the brick or how much space is actually used for the leJOS system files. So, more about that below!

microSD Card after leJOS
  1. 2 Partitions:
    -> Partition 1: 2 MB of data (100% filled partition)
    -> Partition 2: ~250 MB of data (the rest of the card's unused space)
  2. All leJOS programs are located on the card, and run from the card.
Looking at the above, I'm fairly happy that I've chosen to invest in a faster-than-needed card. Each microSD card has a certain classification describing read/write speed. I've chosen a card that is UHS Class 1 / Class 10 (~10 MB/s). This basically means that I can expect read/write speeds better than if I had chosen a Class 2 card (~2 MB/s). I have no idea how much, if at all, this is going to affect my experience with the brick, as I haven't found any documentation of how fast the brick is at reading from the card. More investigation is needed, but I have a feeling that this might be fairly relevant to the performance of the setup.

Setting up leJOS/testing

Simply following the installation guide was enough for me. I did notice, though, that when the leJOS "welcome screen" was up there was no way to shut down the brick, except through a remote connection (for me, ssh), or by removing the battery. No need to panic, though (it did surprise me quite a bit when I couldn't turn off the brick with the "escape" button). The brick is actually responsive through the remote connection.

Moving on, below are my first actual experiments with starting up leJOS on my brick. At first I didn't have any wifi-dongle (I had to wait a bit for that part to become available), so I tried to see if I had created the SD card correctly, under the assumption that the lack of a wifi-dongle would make it quite a bit more challenging to connect to the brick remotely.

Hardware used with brick
leJOS startup screen if no wifi-dongle is used

So far, so good. Nothing beyond what I expected.
Some days later my experiments could continue, now with a wifi-dongle. This resulted in the following:

leJOS running with a wireless connection to my router

Yay. Now to test out the remote control of my now leJOS-powered brick:

Screenshot of the display seen in below video



Now that I have my hello world sample up and running, I foresee no particular difficulty in actually controlling the brick, but these experiments will probably be a bit drawn-out, as my exam period has just begun...
But I will be able to upload a post showing a running robot during the next few weeks (or a post describing a lot of problems related to getting said robot up and running).

Thursday, October 3, 2013

First experiments with the LEGO software

First experiments with the LEGO software

Getting the software for programming the EV3 brick up and running is fairly easy. On a Windows 8 machine, I needed to install two things from the LEGO site. Both installed with no issues and no real difficulty. After having both of these installed, I started the software and was presented with a wealth of tutorials and quick-start guides. After about 5 minutes of watching one of the relevant (for me: development) videos, I was ready to get started - or so I thought. First I had to update the brick's firmware. This was easily accomplished in about a minute with no problems. Now I was good to go for my initial development.

Drag'n'Drop Programming. The above does nothing, but shows a few "building-blocks"
Quickly after creating a new project, I saw the above scary image: drag'n'drop programming. One of my dreaded enemies and a practice I loathe. However, after trying it out I have to admit that LEGO actually has a pretty decent interface, and that I could probably program any behavior I wanted for any given robot with this approach. What's even more impressive is that, while the robot is hooked up to the PC, I can execute the program on my brick with the press of a key (or the big green arrow). The uploading and execution of the program works smoothly and (for me, at least) very intuitively.
I made several small experiments involving making a wheel revolve 360 degrees, reading what the light-sensor recorded and stuff like that. Nothing worth posting, as it mostly involved connecting 2 programming components and pressing "Run".

All in all it was a much more pleasant experience than I expected upon seeing the programming interface.

Moving on from the software, I also built the robot detailed in the physical user guide. The guide claims that you can build the robot in 45 minutes, but I guess I must either be a little slow, or it meant any given version of the robot. I spent about an hour building the robot with all extensions applied to it, and only about 20 minutes building the robot without any extensions. The one-hour mark was without doing any of the suggested programming of the sensors/actuators in a given step!
Here is the finished beast:

Start robot, top (30cm ruler)
Start robot, angle
The brick comes pre-programmed with a small demo program, which makes the robot drive, turn and drive again. While it does this, a pair of eyes with a decreasing amount of wakefulness is displayed on the brick. This can be seen in the video below:


Now all of the above is nice and well, but I feel that I should at least describe the one irritation I experienced during these tests: the brick is quite slow at starting up and even worse at shutting down (2 tests, both the same duration):
Startup:     ~30 seconds
Shutdown: ~45 seconds
And this is with the latest firmware update! Compared to my laptop (running Ubuntu 13.04, old HardDisk):
Startup:     ~  45 seconds
Shutdown: ~    8 seconds
I do have hopes that this will change in the future, but seeing as the brick spends a huge amount of time on the shutdown procedure, I expect that something isn't quite right in that department.

EV3 - what is it all about

EV3 - what is it all about

The newest, at time of writing, LEGO Mindstorms box-set is called EV3. This blog will describe my experiments and (hopefully) insights into the workings of this toy.
I will be focusing on the software that makes the Mindstorms solution work, but in order to fully understand the implications of a given design choice, I expect to look into both the physical aspects of the hardware and (to a lesser degree) the usability from the user's point of view.
My blog will be experiment-driven, with little to no focus on overall coherence. My goal is to uncover and explore/experience fascinating details of how this brick works in relation to leJOS.

The box

Upon receiving the box I quickly noticed the high degree of similarity with the previous version, the NXT. To me, this wasn't a bad thing. The fact that LEGO tries to keep what worked from the previous edition seemed promising. Here are a few pictures:

Box from side, ~43 x 21 x 17 cm (space needed for storage)
Box from top
Opening the box up revealed a familiar sight:

Box initial
Box sorted

The actuators and sensors look deceptively similar to their NXT versions. From this point on, I naturally decided to explore what, exactly, the difference is between the NXT and the EV3.