  • Odd problem with Core Device Interface (seems to impact both Color Sensor and ODS)

    TLDR: sensors returning 0 values when they shouldn't, looking for help

    Ok, I'm usually good at finding some of the subtle hardware interactions that crop up from time to time. They make good educational opportunities. Unfortunately, this one has me a bit stumped as our team has been working on improving their autonomous line follow routine.

    Symptom: Using the Color sensor (properly mounted) to detect crossing the white line. Occasionally (but with too high of a frequency to ignore) their robot would miss the white line. So I had them show me their threshold values and they added code to their programs to show values during init. All looked proper. They're moving faster this year than last and the Color sensor only has an output rate of 30 Hz. Perhaps the dwell time over the line was too short. Back of the envelope showed 2 to 3 sample opportunities, but still, perhaps they were unlucky. They wrote an automated program to read the color sensor while driving back and forth across the line. Hmm, about 70% successful detection, way too low. A little digging suggested using an ODS instead as it has a sample rate of 1000 Hz and better resolution as well.
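
    To make that back-of-the-envelope concrete, here is a minimal sketch of the dwell-time arithmetic; the 2 inch line width and 20 in/s drive speed are hypothetical numbers, not measurements from our robot:
    Code:
    // Rough estimate: how many 30 Hz color-sensor samples land on the line?
    // The line width and robot speed below are assumed values -- plug in your own.
    double lineWidthInches = 2.0;            // assumed width of the white tape
    double speedInchesPerSec = 20.0;         // assumed drive speed
    double sensorRateHz = 30.0;              // advertised color-sensor output rate
    
    double dwellSec = lineWidthInches / speedInchesPerSec;   // ~0.1 s over the line
    double samples = dwellSec * sensorRateHz;                // ~3 sample opportunities
    System.out.println("Dwell (ms): " + dwellSec * 1000.0 + ", samples: " + samples);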

    The swap was made, code updated, and alas, still a problem, and at about the same failure rate. Again thresholds were checked for the new sensor. All looked good (and much better resolution compared to the Color sensor). Looking at the testing code (I'm not a Java programmer, but an old C hack, so I can read most of it), it looks like perhaps they are in a tight, fast loop while probing the sensor module. Could there be a blocking issue? Could idle() be holding too long? How many samples were they taking, and how many were they seeing white on when successfully detecting the line? Add more diagnostics to the code (see below). Turns out the max latency was about 25 milliseconds with idle(). That should be fine. Average time was really fast (<1 ms), so perhaps they were flooding the CDI with requests and nothing was getting updated? Had them insert a sleep(5). Now the average loop time was just over 5 ms, max latency still less than 30 ms. Yet we were still getting crossings where no line was seen, but on crossings where it was seen, it would take about 200-400 ms between first and last white sighting, with perhaps 20-40 "white" counts (numbers from memory). Ok, so the sensor was seeing "white", and frequently enough, even with sleep(5), on a majority of the crossings (sensor reads 0.15 on mat, 0.99 on white, detection threshold set at 0.3). Had them run it over the plain mat and it reported no white hits at all, so it seems to be working well in that case.

    I'm at a loss. Does anyone know how fast the CDI can respond to requests? For that matter, what about the entire USB bus? Consider a drive loop where the users are grabbing the gyro, reading sensors and encoders, adjusting power, etc. while driving, all in a tight loop. Add in telemetry and this seems like a potential traffic jam. Or is the bandwidth plenty high?

    Below is a fragment of test code that might prove useful in case I've missed anything. We also swapped the USB cable just in case we were getting disconnects, but saw no improvement. The only thing we didn't swap was the CDI itself.
    Code:
    private void moveByInches(double inches, double drivingPower) throws InterruptedException{ // moves 26.5 in one rotation
            telemetry();
            int counts = motorRight1.getCurrentPosition();
            double sign = Math.round(inches/Math.abs(inches));
            powerMotors(sign*drivingPower, sign*drivingPower);
            int counter = 0;
            int iters = 0;
            long ms = System.currentTimeMillis();
            long cms;
            long maxdiff = 0;
            long sumdiff = 0;
            long firsttime = -1;
            long endtime = -1;
            while (opModeIsActive() && Math.abs(motorRight1.getCurrentPosition() - counts) < Math.abs(ENCODER_COUNTS_PER_ROTATION*inches/26.5)){
                idle();
                sleep(5);  // to delay between readings in hopes of not flooding CDI with requests
    
                cms = System.currentTimeMillis();
                if (opticalSensor.getLightDetected() > 0.3){ // ODS is 0.15 on mat, 0.99 on white line
                    counter++;
                    if (firsttime == -1){
                        firsttime = System.currentTimeMillis(); // first white sighting
                    }
                    endtime = System.currentTimeMillis();       // last white sighting so far
                }
                sumdiff += cms - ms;
                if (cms - ms > maxdiff) maxdiff = cms - ms;
                ms = cms;
                iters++;
            }
            telemetry.addData("WhiteTime: ", endtime - firsttime);
            telemetry.addData("Maxdiff: ", maxdiff);
            telemetry.addData("Avgdiff: ", (double)sumdiff / (iters));
            telemetry.addData("Counter: ", counter);
            telemetry.addData("Iterations: ", iters);
            telemetry.update();
            powerMotors(0,0);
        }
    Technical Coach, Newton Busters FTC 10138, FLL 3077
    Chicago 'burbs, IL

  • #2
    Please remember your code is running in a preemptive multi-tasking environment. Even if you have a tight loop, some other threads may preempt you and cause delay. Since you already have profiling code to show you your effective sampling rate, it may be that you should just slow down the robot when looking for a line. That's what we did.
    Also, it is true that the sampling rate of the ODS is faster than that of the color sensor. If I remember correctly, the ODS plugs into an analog port while the color sensor is I2C. An I2C bus transaction takes time, so it can't be as fast as the ODS's analog-to-digital conversion. It gets even slower if you have multiple I2C sensors sharing the bandwidth of the I2C bus.



    • #3
      The concern about being preempted is why I had them monitor max loop time. At less than 30ms max, with an average of just over 5ms, this would eliminate the possibility of missing the line due to being blocked.

      Any idea on the bandwidth of the I2C bus? Using the ODS eliminates this as a concern, but I'm curious just the same.
      Technical Coach, Newton Busters FTC 10138, FLL 3077
      Chicago 'burbs, IL



      • #4
        It's been a while since I read the I2C spec. You can probably find the spec online. I think the I2C spec has evolved over the years and supports multiple speeds. So the question is really what the I2C bus speed is for the CDM. You may want to look up the CDM spec and see if it mentions anything about that.



        • #5
          Just looked at the CDM spec:
          I2C ports run at 100 kHz, with a maximum 27-byte data buffer per port.
          So the I2C bus is 100 kHz.



          • #6
            I don't want to get too far afield from the main question here. This still doesn't explain why, even with an analog device (the ODS), they are missing the line. The code above is quite simple. The measured MAX delay indicates that there are no delays long enough to account for missing the line due to being preempted. Something else seems to be happening. I suspect it's something more subtle between the CDI and the SDK in how the sensor data is refreshed. Hoping to find someone more familiar with the internals of the SDK.


            Now, back to the I2C question. I'm not an I2C guru, but looking at the specs on the protocol, let's estimate the bandwidth at 10 kB/s (100 kb/s, rounding the protocol to 10 bits per byte ... yes, I see the actual protocol should be 9 bits, but I'm ignoring other things).

            So (and correct me if my estimates are way off), a gyro read would require a command and address (2 bytes) followed by a return of 18 bytes of data from the sensor. A range sensor would be 2 + 6 bytes, and a color sensor a whopping 2 + 30 bytes. So a robot running along a trajectory, using the gyro to control navigation, a range sensor for collision avoidance, and a color sensor to look for the white line requires a minimum of 60 bytes of data per "iteration". 60 bytes / 10 kB/s -> 6 ms per iteration (ignoring all other overhead). Does this sound right to everyone? This is the MINIMUM time one could expect a data return from the devices, just for the I2C communication.
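
            As a quick sketch of that same arithmetic (the byte counts and the 10 kB/s figure are all rough assumptions, not measured values):
            Code:
            // Rough minimum I2C time per "iteration" at an assumed 10 kB/s effective rate.
            double busBytesPerSec = 10000.0;        // 100 kb/s, rounded to 10 bits per byte
            int gyroBytes  = 2 + 18;                // command/address + estimated data return
            int rangeBytes = 2 + 6;
            int colorBytes = 2 + 30;
            int totalBytes = gyroBytes + rangeBytes + colorBytes;            // 60 bytes
            double minIterationMs = totalBytes / busBytesPerSec * 1000.0;    // ~6 ms, I2C only
            System.out.println("Minimum I2C time per iteration (ms): " + minIterationMs);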

            Now add to this the overhead for the CDI itself, which seems to do a block transfer of all the data at once for every port. I'll have to dig in more and perhaps learn enough Java to do some testing of my own (I'm curious about this myself).
            Technical Coach, Newton Busters FTC 10138, FLL 3077
            Chicago 'burbs, IL



            • #7
              Originally posted by gof View Post
              I don't want to get too far afield from the main question here. This still doesn't explain why, even with an analog device (the ODS), they are missing the line. The code above is quite simple. The measured MAX delay indicates that there are no delays long enough to account for missing the line due to being preempted. Something else seems to be happening. I suspect it's something more subtle between the CDI and the SDK in how the sensor data is refreshed. Hoping to find someone more familiar with the internals of the SDK.
              When considering delays and latency, you need to consider the entire communications cycle.

              Regardless of the final sensor/actuator device, every hardware access must go through the phone's USB bus. This takes a finite amount of time to simply establish a USB connection, access the required device, and then close down the connection. The more devices you access, the longer the overall loop cycle takes.

              The SDK code attempts to ensure that every sensor read is fresh (which is currently implemented as a blocking operation), and that all actuator commands go out before proceeding on to the next command (which also causes blocking). Adding threads and other artifacts doesn't help much here, as everything has to go out the same USB bus connection.

              To reduce bandwidth requirements, identical (redundant) actuator commands (motor powers, mode changes) are suppressed.
              So, sending out the same motor power command each loop takes much less time than sending out a continually changing one.
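
              For illustration only (the SDK does this suppression internally), a user-level sketch of the same idea might look like this; the cached "last power" variable is hypothetical:
              Code:
              // Skip the write when the commanded power hasn't changed since the last loop,
              // so identical commands never generate bus traffic at all.
              double lastLeftPower = Double.NaN;
              
              void setLeftPowerIfChanged(DcMotor leftMotor, double power) {
                  if (power != lastLeftPower) {
                      leftMotor.setPower(power);   // only send a command when the value changes
                      lastLeftPower = power;
                  }
              }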

              Looking at the theoretical data rates is fine, but in reality the android/usb/firmware/hardware path can tell a very different story.

              As a crude rule of thumb, I assume about 8-10 ms for each device access. This varies, and may depend on the actual action/sensor.
              So a stand-alone sensor test may give a 10 ms loop time, but if you ADD two encoder reads and a steering adjustment to 2 motors, you may end up with a 50 ms loop time.

              Where did I come up with these numbers?
              My first few months of opmode testing always included a graphical diagnostics display line on the DS phone which simply showed the min, max and avg loop cycle times as a bar. (accumulated over a second)
              It was quite enlightening seeing the bar bounce up and down.

              So I recommend a simple timer add-on that measures the loop cycle time (in ms) and displays it. This will give you a better feel for what to expect in terms of response times.
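
              A minimal sketch of such a timer inside a linear opmode might look like the following; the once-per-second display window is just one reasonable choice:
              Code:
              // Show min/max/avg loop cycle time (ms) on the Driver Station, reset each second.
              ElapsedTime loopTimer   = new ElapsedTime();
              ElapsedTime windowTimer = new ElapsedTime();
              double minMs = Double.MAX_VALUE, maxMs = 0, sumMs = 0;
              int loops = 0;
              
              while (opModeIsActive()) {
                  double ms = loopTimer.milliseconds();
                  loopTimer.reset();
                  minMs = Math.min(minMs, ms);
                  maxMs = Math.max(maxMs, ms);
                  sumMs += ms;
                  loops++;
              
                  if (windowTimer.seconds() >= 1.0) {   // refresh the display once per second
                      telemetry.addData("Loop ms", "min=%.1f max=%.1f avg=%.1f",
                              minMs, maxMs, sumMs / loops);
                      telemetry.update();
                      minMs = Double.MAX_VALUE; maxMs = 0; sumMs = 0; loops = 0;
                      windowTimer.reset();
                  }
              
                  // ... normal drive and sensor code goes here ...
                  idle();
              }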

              Phil.



              • #8
                Originally posted by Philbot View Post
                As a crude rule of thumb, I assume about 8-10 mS for each device access. This varies, and may depend on the actual action/sensor.
                So a stand-alone sensor test may give a 10ms loop time, but if you ADD two encoder reads, and a steering adjustment to 2 motors, you may end up with a 50mS loop time.
                Thanks Phil, as always!

                So if we were really updating 8 motors and 4 servos every loop, would it be around a 100-120ms loop cycle? And if the loop also read the encoder values from all 8 motors, the loop cycle would slow down by an additional 60-80 ms?

                Do you have any experience with the legacy modules and how access times change if they're being used? Would 8 motor updates and 4 servo updates take a different amount of time if they were all legacy updates going through a legacy module?

                Do you have any experience with update rates for sensors, both using the DIM and using legacy Lego sensors with the legacy module? For example, if a robot used 1, 3, or 5 legacy light sensors, would the access time be 1x, 3x, and 5x as long? Or does the legacy module do its own polling of the devices and give all of the results in a single USB read?



                • #9
                  Please notice the code included in the first post. They did test the per-loop latency, both max and average. Max was less than 30 ms, and average was always about 5.1 ms WITH the sleep(5) and the idle(). So latency or blocking wouldn't seem to be an issue. The remaining piece of the puzzle is how the updates from the CDI are handled by/into the SDK. Since they read both the motor encoder AND the ODS, it seems in conflict with your estimate of 8-10 ms per access. They had << 1 ms average before adding the sleep(5). It would appear, based on the timing evidence, that there may be some buffering of the sensor data. I'm just not sure where to start digging in the SDK to find the specifics. The timing data gleaned from the code above doesn't seem to match your description of the protocol. Hence our confusion.

                  But, even with 20 ms of delay, it shouldn't impact detection of the line. I guess the next step is to buffer ALL the readings in an array, then dump them to a file after the run and plot them to see how the sensor data changes in real time. Then examine them like an oscilloscope trace, looking for the rise and fall of the sensor values where the line crossing should be.
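
                  A rough sketch of that logging approach (the file path and the logged fields are placeholders, not necessarily what we'll use):
                  Code:
                  // Needs java.util.ArrayList and java.io.* imports.
                  ArrayList<String> samples = new ArrayList<>();
                  
                  // Inside the drive loop: record time, encoder, and ODS value.
                  samples.add(String.format("%d,%d,%.3f", System.currentTimeMillis(),
                          motorRight1.getCurrentPosition(), opticalSensor.getLightDetected()));
                  
                  // After the run: dump everything to a CSV file for plotting (path is a placeholder).
                  try (PrintWriter out = new PrintWriter(new FileWriter("/sdcard/FIRST/odsLog.csv"))) {
                      for (String line : samples) out.println(line);
                  } catch (IOException e) {
                      telemetry.addData("Log error", e.getMessage());
                  }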
                  Technical Coach, Newton Busters FTC 10138, FLL 3077
                  Chicago 'burbs, IL



                  • #10
                    As Phil said, it's a good exercise to check the theoretical maximum sensor rate, but in reality you also need to consider the FTC SDK. If you are reading the sensors raw (i.e. not using any FTC SDK APIs and dealing with raw I2C bus transactions, USB communications, etc.) you may get close to your theoretical rate. But the FTC SDK does a lot of behind-the-scenes things. It may not read the physical sensor every time you call it. This is especially true for I2C. I know because I wrote similar code when adding I2C support to our library. I have a cooperative task that reads the I2C sensor at a rate of my choosing, and when the caller asks for data, I return the data from my cache. So you see, if you are trying to determine the sensor data rate that way, you will be deceived.
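
                    Stripped way down, the caching pattern I described looks roughly like this (a sketch using a plain background thread, not our actual cooperative-task code):
                    Code:
                    // A background thread polls the sensor at its own rate; callers always get
                    // the most recent cached value and never block on the bus.
                    class CachedLightSensor {
                        private final OpticalDistanceSensor sensor;
                        private volatile double cachedValue = 0.0;
                    
                        CachedLightSensor(OpticalDistanceSensor sensor, final long periodMs) {
                            this.sensor = sensor;
                            Thread reader = new Thread(new Runnable() {
                                @Override public void run() {
                                    while (!Thread.currentThread().isInterrupted()) {
                                        // The only real hardware read happens here.
                                        cachedValue = CachedLightSensor.this.sensor.getLightDetected();
                                        try { Thread.sleep(periodMs); } catch (InterruptedException e) { return; }
                                    }
                                }
                            });
                            reader.setDaemon(true);
                            reader.start();
                        }
                    
                        double getLightDetected() {
                            return cachedValue;   // cached value, no bus access here
                        }
                    }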



                    • #11
                      I think this is my point. I'm trying to find out why they are getting misses when:
                      1. Theory says data should be available in a reasonable amount of time
                      2. Loop timing (see code in the first post) supports that there are no timing issues
                      3. Data from when it does work supports that there should be a lot (> 20) of positive reads during a line-crossing event.


                      It's not just the theoretical max (only #1 above). It's also the timing data. If the code is looping, and the timing data confirms the max and average delays per loop, then can one expect that the sensor data returned from the calls is "fresh"? It would appear not, but perhaps there's some subtlety I'm missing. Again, I'm not a Java programmer. I'm from the Assembly and C era. I'm expecting any blocking to be manifested in delays in returning from idle(), or while waiting for the sensor calls (getCurrentPosition and getLightDetected) to return. Since the experimental timings DON'T show these types of delays, I'm looking for the next likely source.
                      Technical Coach, Newton Busters FTC 10138, FLL 3077
                      Chicago 'burbs, IL



                      • #12
                        Ok, here's a follow-up to this. We had a chance to rerun the code in the original post while recording the loop time (in milliseconds), the encoder, and the Optical Distance Sensor data. This was recorded for every point along the 10 inch path as the robot moved back and forth across the line. The only thing I wish were also in the raw files is total elapsed time, to make graphing by time easy as well, but I think this demonstrates the problem sufficiently. Here is a link to all the raw data as well as the spreadsheet (and PDFs and jpgs of the charts) showing the charts I'll discuss now, in case anyone wants to look at the data (it's all in the spreadsheet as well; the txt files are comma delimited for your data mining pleasure).
                        https://drive.google.com/drive/folde...Wc?usp=sharing

                        Ok, if you look again at the original post, the code shows the routine just drives straight, reading the light sensor and keeping track of when the sensor exceeds a threshold of 0.3. The sensor shows about 0.08 to 0.15 when going across the mat, and rises to almost 1.0 when directly over white gaffers tape (the line). Two drive speeds (using encoders) were used, 0.3 and 0.4 each with successful and missed line events. We also tested 0.2 (but I didn't get files for those yet), and those never missed the line (out of 80 runs). In all cases the loop timing confirmed that the data points are almost all 5-6 milliseconds apart with no significant delays. So what is happening....

                        Here is a (busy) graph showing many runs. In preview mode I had to right click on the apparent broken image and select "Open image in new tab", hopefully it will work better once posted.



                        0.3 speed runs are triangles, 0.4 speed runs are circles. The X-axis is the encoder distance. The Y-axis is the ODS value. At the bottom, I've provided a 1" Ruler (86 encoder counts) which would indicate the encoder width of the line. Note also the threshold was 0.3. Here you can plainly see that each crossing event that was successful appears as a single threshold line. At first glance, one might wonder why the 0.4 speed data lines are longer since it should just record the width of the line. Remember that 0.4 is moving faster! We have stale sensor data.

                        Let's look at that baseline zoomed in.



                        Here we can see very good detail of what is happening. In the sensor levels, you can clearly see the 10-bit granularity (approximate levels of 0.001, more accurately 1/1024). But each sensor reading persists for between 75 ms and 100 ms. This shouldn't be! The ODS sensor has an advertised data rate of 1000 Hz. While running across a well-used mat, you would expect each reading to bounce around in the baseline range with the minor imperfections of the mat. Unfortunately, we are only getting new data at a rate of 10 to 13 Hz. So detection of the line becomes a hit-or-miss proposition. At a higher speed, it's a lottery whether the sensor value happens to have been "taken" over the line. As you slow down, the "width" of the sensor lag (seen in an encoder view of this chart) shrinks to the point that it falls below the 1" level. You're still in trouble if you're trying to stop RIGHT AT the edge of the line, since that lag will still be there. For us, this explains why we were getting "overshoot" when rotating until aligned on the edge of the line, and the seemingly poor control on line following. We were thinking it was motor delay/inertia on stopping, but from this data I think it's the sensor update frequency. This might also be why we were seeing lag with the Gyro Sensor, but we haven't tested to confirm this yet.

                        I'd REALLY like someone familiar with both the SDK and the hardware to comment. I emailed Modern Robotics with an inquiry about this sensor issue earlier today, but it was before this data. Also, if anyone else could run this on their bot and confirm/duplicate these results I'd appreciate it. I'd love to find out it's something else and not a hardware or SDK bug. Perhaps it's just us, but we've been simplifying the code as much as possible to eliminate potential sources of conflict. The encoder values seem to be updating correctly, so this seems centered on the CDI.
                        Technical Coach, Newton Busters FTC 10138, FLL 3077
                        Chicago 'burbs, IL



                        • #13
                          I am glad you are looking so deeply into this issue. That made me curious too, so I did a little experiment today with the I2C gyro and it surprised me. We always had a hard time tuning our PID constants for turns. So I logged some gyro data to see what's going on, and I found that while the robot is turning, the gyro heading changes its value at an average interval of 300 ms even though our loop time is about 10 ms!!! It means the heading doesn't change for approx. 30 loops. Since our robot has 4 I2C sensors, I unplugged all the other I2C sensors, leaving only the gyro, to eliminate the I2C bus congestion issue, but the 300 ms interval remained the same. Now, this preliminary data is not very representative because I was using the competition code, which still attempted to read the unplugged I2C devices. I am going to write a test opmode tomorrow with only the gyro in it and test again.



                          • #14
                            Question: Does this happen to you only in "linear" opmode, or can you make an "iterative" opmode that does the same thing?

                            I ask because my kids are doing their auto in iterative (or what was once called regular) opmode. And 4" wheels, tetrix motors, ~230 CPI, RUN_WITH_ENCODERS, will never miss a line. Or at least not that I've seen with them yet.

                            That said, I've seen ample evidence of slow sensors while watching them work, both in terms of update rate and "lag." Though we have no logs as good as yours.

                            And they do have telemetry on all their sensor mins and maxes and averages. And there are definitely times when the I2C sensors seem to go away, or just come back with zeros/255's. And their code has lots of lines that look like IF ((X==0) || (X==255)) THEN [ignore that data for now and...]
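
                            In Java that guard might look something like the sketch below; the sensor call and the "last good reading" fallback are just placeholders for whatever the ignore-it logic actually does:
                            Code:
                            // Reject the 0/255 sentinel values a flaky I2C read sometimes returns.
                            int raw = colorSensor.alpha();      // or whichever channel is in use
                            if (raw == 0 || raw == 255) {
                                raw = lastGoodReading;          // ignore the suspect sample for now
                            } else {
                                lastGoodReading = raw;
                            }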

                            And as they do simple closed loop type things on gyro and gyro integrated heading and such, I've encouraged them to just stick with simple P or PD type things. In part because they understand those, and in part because loop times, lags, and update rates seem so variable that integrators will be a PIA to work with. And it's not much of a teachable moment to teach control law in such an obviously polling/blocking/out-to-lunching/whatever environment.

                            I do not think you are alone, or hardware unique.

                            One of the kids asked "how come line following was faster and more stable when we were doing FLL?"

                            Well... it seems to be a long story... please keep posting what you find!

                            - Z



                            • #15
                               I tested a bare-bones practice robot with a simple opmode that supports only 4 wheels and a gyro. The average interval at which the gyro changes its value is approx. 80 msec. The average LinearOpMode loop interval is about 1 msec, so the gyro value on average changes every 80 loops. That's not as bad as I thought, although I could use a faster sampling rate. Now, why does the competition robot have a 300 msec sampling interval on the gyro? It could be because of the other I2C devices on the bus sharing bandwidth. I am going to add I2C sensors to the bare-bones practice robot one at a time to see if it affects the gyro sampling interval. BTW, I have included the simple opmode code here because it's very easy to change to test other sensors.
                              Code:
                              package samples;
                              
                              import android.util.Log;
                              
                              import com.qualcomm.hardware.modernrobotics.ModernRoboticsI2cGyro;
                              import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
                              import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
                              import com.qualcomm.robotcore.hardware.DcMotor;
                              
                              @TeleOp(name="Test: Loop Timing", group="3543TestSamples")
                              public class TimingTest extends LinearOpMode
                              {
                                  private enum SensorType
                                  {
                                      DRIVEBASE_ENCODERS,
                                      GYRO
                                   }
                              
                                  private static final String TAG = "TrcDbg";
                                  private static final double DRIVE_POWER = 0.2;
                                  private static final double TURN_POWER = 0.5;
                                  private static SensorType sensorType = SensorType.GYRO;
                                  private static final DcMotor.Direction LEFTWHEEL_DIRECTION = DcMotor.Direction.FORWARD;
                                  private static final DcMotor.Direction RIGHTWHEEL_DIRECTION = DcMotor.Direction.REVERSE;
                              
                                  private DcMotor lfWheel;
                                  private DcMotor rfWheel;
                                  private DcMotor lrWheel;
                                  private DcMotor rrWheel;
                                  private ModernRoboticsI2cGyro gyro;
                              
                                  public void runOpMode()
                                  {
                                      initRobot();
                              
                                      waitForStart();
                              
                                      long minLoopInterval = Long.MAX_VALUE;
                                      long maxLoopInterval = Long.MIN_VALUE;
                                      long loopCount = 0;
                                      long prevLoopTime = 0;
                              
                                      long minSampleInterval = Long.MAX_VALUE;
                                      long maxSampleInterval = Long.MIN_VALUE;
                                      long sampleCount = 0;
                                      long prevSampleTime;
                              
                                      long startTime = System.nanoTime();
                                      prevSampleTime = startTime;
                                      int prevSample = getSensorValue();
                              
                                      while (opModeIsActive())
                                      {
                                          long currTime = System.nanoTime();
                                          int currSample = getSensorValue();
                                          if (prevLoopTime != 0)
                                          {
                                              long loopInterval = currTime - prevLoopTime;
                              
                                              if (currSample != prevSample)
                                              {
                                                  long sampleTime = currTime - prevSampleTime;
                                                  sampleCount++;
                                                  prevSample = currSample;
                                                  prevSampleTime = currTime;
                                                  if (sampleTime < minSampleInterval)
                                                      minSampleInterval = sampleTime;
                                                  else if (sampleTime > maxSampleInterval)
                                                      maxSampleInterval = sampleTime;
                                              }
                              
                                              if (loopInterval < minLoopInterval)
                                              {
                                                  minLoopInterval = loopInterval;
                                              } else if (loopInterval > maxLoopInterval)
                                              {
                                                  maxLoopInterval = loopInterval;
                                              }
                              
                                              startRobot(String.format("[%4d:%7.3f] LoopInterval=%7.3f, ",
                                                      loopCount, (currTime - startTime)/1000000.0, loopInterval/1000000.0));
                                          }
                              
                                          prevLoopTime = currTime;
                                          loopCount++;
                                      }
                                      stopRobot();
                              
                                      long endTime = System.nanoTime();
                                      Log.i(TAG, String.format(
                                              "Loop: MinInterval=%7.3f, MaxInterval=%7.3f, AvgInterval=%7.3f",
                                              minLoopInterval/1000000.0, maxLoopInterval/1000000.0,
                                              (endTime - startTime)/1000000.0/loopCount));
                                      Log.i(TAG, String.format(
                                              "Sensor: MinSampleInterval=%7.3f, MaxSampleInterval=%7.3f, AvgSampleInterval=%7.3f",
                                              minSampleInterval/1000000.0, maxSampleInterval/1000000.0,
                                              (endTime - startTime)/1000000.0/sampleCount));
                                  }
                              
                                  private void initRobot()
                                  {
                                      lfWheel = hardwareMap.dcMotor.get("lfWheel");
                                      rfWheel = hardwareMap.dcMotor.get("rfWheel");
                                      lrWheel = hardwareMap.dcMotor.get("lrWheel");
                                      rrWheel = hardwareMap.dcMotor.get("rrWheel");
                                      lfWheel.setDirection(LEFTWHEEL_DIRECTION);
                                      lrWheel.setDirection(LEFTWHEEL_DIRECTION);
                                      rfWheel.setDirection(RIGHTWHEEL_DIRECTION);
                                      rrWheel.setDirection(RIGHTWHEEL_DIRECTION);
                              
                                      lfWheel.setMode(DcMotor.RunMode.STOP_AND_RESET_ENCODER);
                                      rfWheel.setMode(DcMotor.RunMode.STOP_AND_RESET_ENCODER);
                                      lrWheel.setMode(DcMotor.RunMode.STOP_AND_RESET_ENCODER);
                                      rrWheel.setMode(DcMotor.RunMode.STOP_AND_RESET_ENCODER);
                                      lfWheel.setMode(DcMotor.RunMode.RUN_WITHOUT_ENCODER);
                                      rfWheel.setMode(DcMotor.RunMode.RUN_WITHOUT_ENCODER);
                                      lrWheel.setMode(DcMotor.RunMode.RUN_WITHOUT_ENCODER);
                                      rrWheel.setMode(DcMotor.RunMode.RUN_WITHOUT_ENCODER);
                              
                                      gyro = (ModernRoboticsI2cGyro)hardwareMap.gyroSensor.get("gyroSensor");
                                      gyro.resetZAxisIntegrator();
                                  }
                              
                                  private int getSensorValue()
                                  {
                                      int value = 0;
                              
                                      switch (sensorType)
                                      {
                                          case DRIVEBASE_ENCODERS:
                                              value = (lfWheel.getCurrentPosition() + rfWheel.getCurrentPosition() +
                                                       lrWheel.getCurrentPosition() + rrWheel.getCurrentPosition())/4;
                                              break;
                              
                                          case GYRO:
                                              value = -gyro.getIntegratedZValue();
                                              break;
                                      }
                              
                                      return value;
                                  }
                              
                                  private void startRobot(String prefix)
                                  {
                                      switch (sensorType)
                                      {
                                          case DRIVEBASE_ENCODERS:
                                              //
                                              // Driving forward and checking encoders.
                                              //
                                               lfWheel.setPower(DRIVE_POWER);
                                              rfWheel.setPower(DRIVE_POWER);
                                              lrWheel.setPower(DRIVE_POWER);
                                              rrWheel.setPower(DRIVE_POWER);
                                              Log.i(TAG, prefix + String.format("lf=%d, rf=%d, lr=%d, rr=%d",
                                                      lfWheel.getCurrentPosition(), rfWheel.getCurrentPosition(),
                                                      lrWheel.getCurrentPosition(), rrWheel.getCurrentPosition()));
                                              break;
                              
                                          case GYRO:
                                              //
                                              // Turning right and checking gyro.
                                              //
                                               lfWheel.setPower(TURN_POWER);
                                              lrWheel.setPower(TURN_POWER);
                                              rfWheel.setPower(-TURN_POWER);
                                              rrWheel.setPower(-TURN_POWER);
                                              Log.i(TAG, prefix + String.format("heading=%d", -gyro.getIntegratedZValue()));
                                              break;
                                      }
                                  }
                              
                                  private void stopRobot()
                                  {
                                      lfWheel.setPower(0.0);
                                      lrWheel.setPower(0.0);
                                      rfWheel.setPower(0.0);
                                      rrWheel.setPower(0.0);
                                  }
                              
                              }
