I2C


  • I2C

    Hello, I am trying to program an I2C sensor that requires you to send and receive queries without using registers. How can this be done using Android Studio?

  • #2
    Edit: That is, what would be the steps necessary to make a simple program to read raw I2C output without registers?



    • #3
      Originally posted by 4262KJ View Post
      Edit: That is, what would be the steps necessary to make a simple program to read raw I2C output without registers?
      What sensor are you trying to use, and do you have a copy of that sensor's datasheet?



      • #4
        I am trying to use a Pixy camera as an I2C device. I have found three protocols that can be used over I2C:
        1. The camera constantly sends a stream of bytes, and useful information is marked by the sensor sending the word 0xaa55 twice, which means that the next 16 bytes are useful information.
        2. The camera waits until you send a byte, then sends a response with the information requested by that byte (for example, sending 0x50 causes the camera to send back information about the largest object it can see).
        3. This one appears to use registers, but I am not sure because the documentation is confusing.
        I believe this means the camera is usable if I can get the raw I2C output and look for the two bytes 0xaa55. I have attached information on each of the three protocols (a rough sketch of the first protocol's sync-word scan follows the links):
        First Protocol
        Second Protocol
        Third Protocol
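
        For the first protocol, here is a minimal sketch of the sync-word scan in Java. The InputStream is just a stand-in for whatever raw I2C byte source the platform exposes, and the method name readBlock is ours, for illustration only:

            // Sketch only: find the double 0xaa55 sync in the raw stream, then
            // pull the 16-byte object block described above.
            static byte[] readBlock(java.io.InputStream in) throws java.io.IOException {
                int lastWord = 0;
                while (true) {
                    int lo = in.read() & 0xff;   // Pixy sends each word low byte first
                    int hi = in.read() & 0xff;
                    int word = (hi << 8) | lo;
                    if (word == 0xaa55 && lastWord == 0xaa55) break;  // frame start
                    lastWord = word;
                }
                byte[] block = new byte[16];     // the 16 bytes of useful information
                for (int i = 0; i < block.length; i++) block[i] = (byte) in.read();
                return block;
            }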




          • #6
            I swear I replied to this thread but my reply disappeared. I probably forgot to hit the reply button at the end. In any case, I suspected you were using the Pixy camera, and I am interested in the answer to this thread because we are using the Pixy camera in FRC this season and just ported the code to FTC; the only missing piece is a wrapper class that provides the platform-dependent access to the I2C bus. We are also stuck on calling I2C read without a register address. In FRC, the WPILib I2C class provided two methods: read(int registerAddress, int length, byte[] buffer) and readOnly(byte[] buffer, int length). So, if the device doesn't need a register address, we can just call readOnly (see the sketch below). There is no equivalent of this in the FTC SDK, so I am interested in the answer too.
            Regarding the "three protocols" you listed, only the first protocol is relevant. The other two are for the Pixy Adapter from Mindsensors.com, so unless you have the Pixy Adapter and are planning to connect it through the Legacy module, they are not relevant.
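
            For reference, this is the shape of the wrapper we are missing, modeled on those WPILib signatures. The interface and its name are ours, for illustration only, not anything in the FTC SDK:

                // Register-addressed vs. raw access, split the way WPILib's I2C class splits them.
                public interface RawI2cAccess {
                    // write the register byte first, then read 'length' bytes back
                    boolean read(int registerAddress, int length, byte[] buffer);
                    // raw read: no register byte, just clock 'length' bytes off the bus
                    boolean readOnly(byte[] buffer, int length);
                }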



            • #7
              Originally posted by mikets View Post
              Regarding the "three protocols" you listed, only the first protocol is relevant. The other two are for the Pixy Adapter from Mindsensors.com. So unless you have the Pixy Adapter and are planning to connect it through the Legacy module, it is not relevant.
              The second one uses the LEGO firmware that can be loaded through PixyMon; I'm thinking of using that and plugging the camera in as a normal I2C device. As for the third, I want to get the Pixy Adapter and plug that in as a normal I2C device as well.



              • #8
                Originally posted by 4262KJ View Post
                The second one uses the LEGO firmware that can be loaded through PixyMon; I'm thinking of using that and plugging the camera in as a normal I2C device. As for the third, I want to get the Pixy Adapter and plug that in as a normal I2C device as well.
                Our code supports the "First protocol" you listed. It works very well in FRC but like I said, we are missing the "no register address" access code in FTC.



                • #9
                  Originally posted by 4262KJ View Post
                  I am trying to use a Pixy camera as an I2C device. I have found three protocols that can be used over I2C. [...]
                  We are using the Pixy camera on our robot without any problems. We use it with the Lego firmware loaded. In this mode, the I2C address of the Pixy is 0x01 and you can use the registers 0x51-0x57 to access the largest blob of signatures 1-7. You just need to connect the four wires (vdd, gnd, sda, scl) from the Pixy camera to the DIM.
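
                  If it helps, here is a minimal sketch of what that register read might look like with the SDK's generic I2cDeviceSynch device. We have not verified this exact code; the configured name "pixy" and the telemetry dump are just for illustration:

                      import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
                      import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
                      import com.qualcomm.robotcore.hardware.I2cAddr;
                      import com.qualcomm.robotcore.hardware.I2cDeviceSynch;

                      @TeleOp(name = "PixyLegoTest")
                      public class PixyLegoTest extends LinearOpMode {
                          @Override public void runOpMode() throws InterruptedException {
                              // "pixy" is whatever name the device was given in the configuration
                              I2cDeviceSynch pixy = hardwareMap.i2cDeviceSynch.get("pixy");
                              pixy.setI2cAddress(I2cAddr.create7bit(0x01)); // Lego firmware address
                              pixy.engage();
                              waitForStart();
                              while (opModeIsActive()) {
                                  // 0x51 = largest blob of signature 1; the record is 6 bytes
                                  byte[] block = pixy.read(0x51, 6);
                                  telemetry.addData("sig1", "%02x %02x %02x %02x %02x %02x",
                                          block[0], block[1], block[2], block[3], block[4], block[5]);
                                  telemetry.update();
                              }
                          }
                      }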

                  We tried to use the normal firmware on the Pixy that sends a stream of bytes with the preamble of (0xaa55, 0xaa55), but since the camera processes frames at 50 Hz (20 ms) and we were not able to read from the DIM faster than ~30 ms, we could only read ~20 bytes before the next frame overwrote the current frame. We would only get data for one blob. We are still working on this problem.

                  Please tell us if you need more information.



                  • #10
                    I've been trading emails with somebody senior at CharmedLabs (the Pixy manufacturer). They'd like to be put in touch with somebody who could either a) help them make their firmware more FTC friendly or b) connect with somebody on the FTC SDK team so that together they could coordinate having Pixy be better and more directly supported.

                    I had suggested they join the forum here, but they report their attempt to create a forum account was rejected. (Perhaps because they are not a team or approved vendor??? I have no idea how that works)

                    I am not helpful when it comes to the i2c software in question. But if somebody who does know how to help would like to PM me or leave a message here, I'll send you the contact info.

                    Z

                    coach 8381



                    • #11
                      What do they have in mind to make it more "FTC friendly"? We have used the Pixy camera in our FRC competition, so I am pretty familiar with how to communicate with it via I2C, serial, or even analog and digital input. The complication was parsing the data block, but that is nothing that can't be handled by a state machine (sketched below). Our library now has a module that handles the parsing of the data block.

                      The only thing I can think of that they could improve is the object recognition, because while we were playing with the Pixy camera I got the impression that their algorithm recognizes a colored blob, not so much a shape. Any object with the same color will be "detected", which caused a lot of false positives, so we had to come up with a filtering algorithm to tell the falsely detected objects apart from the real ones. I can see the Pixy camera having a hard time in FTC because everything is either RED or BLUE and the field has a lot of those colors.
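
                      The rough shape of that state machine, fed one little-endian 16-bit word at a time. The field layout is paraphrased from the Pixy docs, so double-check it against the datasheet:

                          // After an 0xaa55 sync word, each object block is six words:
                          // checksum, signature, x, y, width, height.
                          enum State { LOOKING_FOR_SYNC, IN_BLOCK }

                          State state = State.LOOKING_FOR_SYNC;
                          int[] fields = new int[6];
                          int fieldIndex = 0;

                          void onWord(int word) {              // feed one 16-bit word at a time
                              switch (state) {
                                  case LOOKING_FOR_SYNC:
                                      if (word == 0xaa55) { state = State.IN_BLOCK; fieldIndex = 0; }
                                      break;
                                  case IN_BLOCK:
                                      if (word == 0xaa55) { fieldIndex = 0; break; }  // next block
                                      fields[fieldIndex++] = word;
                                      if (fieldIndex == fields.length) {
                                          // full block: verify checksum, hand fields[] to the robot
                                          state = State.LOOKING_FOR_SYNC;
                                      }
                                      break;
                              }
                          }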



                      • #12
                        I'm told that the connection was made and the right folks are talking.

                        Not sure what, if anything, will come of it. I'm hoping for a future SDK with an easy-for-kids-to-use method dedicated to Pixy, as has been done with other sensors, because not all teams (certainly not my kids) have the scale and abilities to produce their own libraries for such things. Sadly, they lack a real software coach.

                        I don't think Pixy will ever enable the kinds of things possible with Vuforia or other image processing libraries, but I do think it will give less software-endowed teams the ability to do some vision work. And if I really got my wish, Pixy would let some of its various gamma and contrast settings be modified by the kids' FTC code, so basic adaptation to field conditions is possible.

                        I've been having the summer kids on our team play with it in analog mode. And yes, it is just color blob detection. But it's very good at it, and you can train for multiple colors to be reported separately by size and location (though obviously that's not coming out in the simple analog I/O). The code to get at that info over I2C is beyond my kids, but they certainly could make use of it if they got a little SDK jump start on it.

                        More interesting to me was its ability to train on multiple color key combinations, like red-next-to-blue. When it detects those, it gives not only the location but also the angle of the line of transition between red and blue. So, for instance, on last season's field, with the red-blue boundary in the middle of the field, a robot with only a Pixy cam could look at that border and use it to orient its rotation in the middle of the field, possibly correcting for all the bumping and thumping or gyro drift it saw on the way to getting there. The kids mounted up Pixy and had it look at floor tape like that, and the phi angle output seemed really accurate and stable at first glance.
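
                        A hypothetical sketch of that idea, just to make it concrete. The phi value, the motor objects (the SDK's DcMotor), and the gain are all assumptions for illustration, not anything defined by the SDK:

                            // Rotate in place until the phi angle reported for the
                            // red/blue boundary reads roughly zero.
                            void squareUp(double phi, DcMotor leftDrive, DcMotor rightDrive) {
                                double kP = 0.02;                          // tune on the robot
                                double turn = Math.max(-0.3, Math.min(0.3, kP * phi));
                                leftDrive.setPower(turn);
                                rightDrive.setPower(-turn);
                            }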



                        • #13
                          Mechromancers: My kids have been trying to get their Pixy to work with Lego firmware, having struggled and failed with the non-Lego protocols. At the moment, when they access 0x50 they get just the first byte of the block, and they can see the signature code changing when there is something for Pixy to detect, so they can be pretty sure they have it all hooked up and working properly on the hardware side. But the coder student can't figure out how to get all six bytes: each loop() cycle they just see that first byte coming back. I think they are using the read method of I2cDeviceSynch in their code, and the hardware is the Rev Expansion Hub (if that makes a difference).

                          I'm no coach when it comes to software, so if there's a different method they should be using, or a syntax for getting I2cDeviceSynch to return a specified number of bytes into an array instead of just one, I'd love a tip so I can help unstick them.

                          thanks in advance.

                          Z
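
                          One thing worth checking: besides read8(int), the I2cDeviceSynch interface declares a multi-byte overload, byte[] read(int ireg, int creg). Whether the Lego firmware will stream the whole six-byte record from one starting register this way is an assumption to verify on the hardware, but it is the closest thing in the SDK to "a specified number of bytes into an array":

                              // Sketch: ask for all six bytes in one transaction instead of
                              // one byte per loop() pass. 'pixy' is the mapped I2cDeviceSynch.
                              byte[] block = pixy.read(0x50, 6);
                              telemetry.addData("block", "%02x %02x %02x %02x %02x %02x",
                                      block[0], block[1], block[2], block[3], block[4], block[5]);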



                          • #14
                            Originally posted by zain View Post
                            Mechromancers: My kids have been trying to get their Pixy to work with Lego firmware. [...]
                            We also wanted to look into using Pixy, but I don't think it's possible with the limitations in the SDK on what I2C transactions can be issued. In fact, it comes from the hardware limitations of the Modern Robotics CDIM. If you look at the Arduino Pixy I2C implementation, you'll see that all reads consist of a bunch of calls to getWord(), and none of those calls specifies a register number to read from or write to; they just send and receive raw bytes. This is perfectly allowed by I2C, but the CDIM is only capable of a smaller subset of I2C transactions: it requires that when you read or write, your transaction starts by writing the register byte. Pixy has no concept of registers, so this model doesn't work for it. In fact, once you send that register byte, Pixy thinks it is the start of a command you are trying to give it. Not sure if there is a way to work around this, but it seems to me that this means it is impossible to implement the Pixy protocol within the SDK.
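
                            To make the mismatch concrete, here is roughly what the Arduino library's word read looks like translated to Java. The readByte() call is a stand-in for a raw one-byte bus read, which is exactly the primitive the CDIM model does not give us:

                                // Raw Pixy word read: clock two bytes off the bus, low byte
                                // first, with no register write starting the transaction.
                                int getWord() {
                                    int lo = readByte() & 0xff;  // low-order byte comes first
                                    int hi = readByte() & 0xff;
                                    return (hi << 8) | lo;
                                }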



                            • #15
                              If you download the Lego firmware to the Pixy, it *does* know about registers. See attached PDF. But I think the SDK is still the limitation? Perhaps only on Rev? (Since Mechromancers says they got it working.) What seems to need to happen is six successive reads of the *same* I2C register address before closing (or whatever the right word is) the I2C transaction. The provided methods in the SDK let the students read one byte from one register address, or read successive single bytes from successive registers. But you can't (or we can't figure out how to) do a set number of reads from a fixed register in order to unload all six bytes associated with each detected blob.

                              Possibly complicating things is that the kids are doing this with the new Rev Expansion Hub, not the CDIM. Since a 5V-powered Pixy is still 3.3V I2C compliant, it was easy to run Pixy from the aux 5V on the Rev hub and connect the right I2C signal wires between the two. In the MR setup menus there was a generic I2C device choice, "I2C Device", which used the I2cDevice class. In Rev it seems that menu choice is now "I2C Device (Synchronous)" and uses the I2cDeviceSynch class. So I do wonder if some people have made it work on MR hardware but the same code would not work on Rev hardware. Or possibly this change appeared in the SDK as things moved along and we erroneously associated it with when we switched between MR and Rev hardware.

                              In any case: it sure *seems* like if Pixy is on the Lego firmware and there were a way to read the *same* register multiple times in one I2C transaction, then this would work. But right now they are limited to getting just the first of the six bytes of blob data out.

                              Hope that made sense (again, mechanical non-SW coach here). And I hope some folks (or the SDK team) have some ideas for how to make this work for the kids, because Pixy has been super fun for showing students the real basics of this stuff. Having seen in real time what's going on, they will better make the leap to color filtering, morphological operations, blob detection and centroids, and all the basics that lead into that world.
                              Attached Files

