TensorFlow Lite Tutorial

  • TensorFlow Lite Tutorial

    Hi Folks,

    There is a TensorFlow Lite Tutorial that shows how to (using Blocks or Java) detect Gold and Silver Minerals using Google's TensorFlow technology.

    https://github.com/ftctechnh/ftc_app...eral-Detection

    Tom
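    For readers who want a starting point in Java, a minimal initialization along the lines of that tutorial might look like the sketch below. It is patterned on the public ConceptTensorFlowObjectDetection sample from the 2018-19 SDK; the asset name RoverRuckus.tflite, the two labels, and the ClassFactory calls come from that sample, so verify them against your SDK version before relying on this.

    Code:
    import com.qualcomm.robotcore.eventloop.opmode.Autonomous;
    import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
    import org.firstinspires.ftc.robotcore.external.ClassFactory;
    import org.firstinspires.ftc.robotcore.external.navigation.VuforiaLocalizer;
    import org.firstinspires.ftc.robotcore.external.tfod.Recognition;
    import org.firstinspires.ftc.robotcore.external.tfod.TFObjectDetector;
    import java.util.List;

    @Autonomous(name = "TFOD Init Sketch")
    public class TfodInitSketch extends LinearOpMode {
        private static final String TFOD_MODEL_ASSET = "RoverRuckus.tflite";
        private static final String LABEL_GOLD_MINERAL = "Gold Mineral";
        private static final String LABEL_SILVER_MINERAL = "Silver Mineral";
        private static final String VUFORIA_KEY = "-- YOUR VUFORIA KEY --";

        private VuforiaLocalizer vuforia;
        private TFObjectDetector tfod;

        @Override
        public void runOpMode() {
            // Vuforia supplies the camera frames that the TensorFlow Lite detector analyzes.
            VuforiaLocalizer.Parameters vuforiaParams = new VuforiaLocalizer.Parameters();
            vuforiaParams.vuforiaLicenseKey = VUFORIA_KEY;
            vuforiaParams.cameraDirection = VuforiaLocalizer.CameraDirection.BACK;
            vuforia = ClassFactory.getInstance().createVuforia(vuforiaParams);

            // Create the detector and load the Rover Ruckus model with its two labels.
            int monitorViewId = hardwareMap.appContext.getResources().getIdentifier(
                    "tfodMonitorViewId", "id", hardwareMap.appContext.getPackageName());
            TFObjectDetector.Parameters tfodParameters = new TFObjectDetector.Parameters(monitorViewId);
            tfod = ClassFactory.getInstance().createTFObjectDetector(tfodParameters, vuforia);
            tfod.loadModelFromAsset(TFOD_MODEL_ASSET, LABEL_GOLD_MINERAL, LABEL_SILVER_MINERAL);

            waitForStart();
            tfod.activate();

            while (opModeIsActive()) {
                // getUpdatedRecognitions() returns null when there is no new frame to report.
                List<Recognition> recognitions = tfod.getUpdatedRecognitions();
                if (recognitions != null) {
                    telemetry.addData("# Objects Detected", recognitions.size());
                    telemetry.update();
                }
            }
            tfod.shutdown();
        }
    }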

  • #2
    Tom - what is the label used for the silver mineral?



    • #3
      The label is "Silver Mineral".



      • #4
        Originally posted by Tom Eng
        Hi Folks,

        There is a TensorFlow Lite Tutorial that shows how to (using Blocks or Java) detect Gold and Silver Minerals using Google's TensorFlow technology.

        https://github.com/ftctechnh/ftc_app...eral-Detection

        Tom
        Thanks for hanging in so late (on a Friday) to get this done, Tom.

        I'm sure the many teams that meet over the weekend or (like us) have events this weekend will really find this useful.

        We made it this far, so we can verify that it's pretty plug and play...

        [Attached image: redfish tensor progress.jpg]

        Michael P Clark
        Founding Mentor, FTC 9958
        http://www.redfishrobotics.com
        "We're Hooked on FIRST"



        • #5
          Originally posted by Comrade 17
          The label is "Silver Mineral".
          Thanks. We will try it today.



          • #6
            Has anyone gotten this to work reliably? We have found that it does not reliably see all three minerals simultaneously. It seems very dependent on lighting and orientation of the blocks. It mainly has issues with the silver minerals.



            • #7
              You can modify the code to only look for gold minerals and then base your autonomous on where the lowest gold mineral is.
              CHEER4FTC website and facebook online FTC resources.
              Providing support for FTC Teams in the Charlottesville, VA area and beyond.
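
              A sketch of that gold-only approach, placed inside the op mode loop after tfod.activate(). It assumes the SDK's Recognition getters (getLabel(), getLeft(), getBottom()); the variable names and the "nearest the bottom edge = closest mineral" reading of "lowest" are illustrative.

              Code:
              // Inside the op mode loop, after tfod.activate():
              List<Recognition> recognitions = tfod.getUpdatedRecognitions();
              if (recognitions != null) {
                  Recognition nearestGold = null;
                  for (Recognition r : recognitions) {
                      if ("Gold Mineral".equals(r.getLabel())) {
                          // Keep the gold detection nearest the bottom of the image,
                          // i.e. the candidate closest to the robot.
                          if (nearestGold == null || r.getBottom() > nearestGold.getBottom()) {
                              nearestGold = r;
                          }
                      }
                  }
                  if (nearestGold != null) {
                      telemetry.addData("Gold left edge (px)", nearestGold.getLeft());
                      telemetry.update();
                      // Choose the sampling path based on nearestGold.getLeft() here.
                  }
              }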



              • #8
                Originally posted by Cheer4FTC
                You can modify the code to only look for gold minerals and then base your autonomous on where the lowest gold mineral is.
                Thanks. After a lot more testing, we had to abandon the idea of sensing multiple minerals at the same time, as it just will not work reliably. We are now focusing only on the gold mineral, but have not decided whether it is possible to reliably determine its relative position in the phone's field of view, or whether to simply drive closer to each mineral and determine if it is gold or not.



                • #9

                  For reliable results from the TensorFlow Example OpMode (in Blocks), make sure the phone is set to Auto-rotate.

                  This is clearly explained here:
                  https://github.com/ftctechnh/ftc_app...ge-orientation

                  The overall OpMode documentation is here:
                  https://github.com/ftctechnh/ftc_app...ection-Op-Mode


                  Without this, your tests so far have displayed the Gold mineral's relative vertical position in the image, rather than its relative horizontal position. This would randomly seem 'right' or 'wrong', creating much frustration. Been there!

                  If for some reason you want the phone locked to Portrait mode, then in the Example code (in Blocks) simply change the 3 instances of Recognition.Left to Recognition.Top.

                  I will post this note in the several threads reporting unreliable TensorFlow results.
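
                  The same idea in Java, for teams not using Blocks: with the phone auto-rotated to landscape, compare horizontal positions with getLeft(); with the phone locked to portrait, use getTop() instead. The comparison below mirrors the three-mineral logic in the sample op mode; variable names are illustrative and assume one gold and two silver Recognitions are in view.

                  Code:
                  // Landscape (auto-rotate): getLeft() is the horizontal position.
                  // Portrait (rotation locked): swap getLeft() for getTop().
                  float goldX = goldRecognition.getLeft();
                  float silver1X = silverRecognition1.getLeft();
                  float silver2X = silverRecognition2.getLeft();

                  String goldPosition;
                  if (goldX < silver1X && goldX < silver2X) {
                      goldPosition = "LEFT";
                  } else if (goldX > silver1X && goldX > silver2X) {
                      goldPosition = "RIGHT";
                  } else {
                      goldPosition = "CENTER";
                  }
                  telemetry.addData("Gold Mineral Position", goldPosition);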



                  • #10
                    Thanks. As I also posted in multiple threads, this is not the issue we are seeing; we have been in auto-rotate from the beginning. The issue is due to incorrect modeling of the silver mineral, IMHO.



                    • #11
                      We noticed that the silver minerals don't show well on our white floor. We can read all three best when we put our gray mat on a dark floor. I don't know how this will translate in competition though. We might try increasing our accuracy above 50% and see how that affects the outcome.



                      • #12
                        Silver doesn't work in many conditions. I am fairly confident that they screwed up when modeling them. I believe they modeled them alone with a contrasting background. My theory is that when you put the white balls on the red and blue squares, the red and blue peeking out from the bottom screws up the detection. We have done a lot of testing, and TensorFlow detects the white balls just fine on a contrasting background, but when you put them on the red or blue squares, the detection gets screwed up. As the light levels increase, especially in bright gymnasiums, the detection rate falls to about 50% in these conditions. It is far worse on the red squares.

                        They should have modeled the white balls in real-world conditions, which is on the red and blue 2-inch squares, under lighting similar to a typical gymnasium (a real-world competition scenario).



                        • #13
                          Here is the actual training model:
                          https://github.com/google/ftc-object...aster/training
                          and here is a tutorial on training a model from scratch:
                          https://medium.com/tensorflow/traini...s-b78971cf1193

                          I told my team to explore this and try training a model themselves. Let's see how far we can go given the upcoming competition and the pending robot build, but I suggest other teams try it as well.



                          • #14
                            Hello, we are trying to use the object detection example with TensorFlow Lite and are running into issues with reliability. Reading the tutorial text, it indicates a way to change the confidence level one wants to use. Would someone tell us where this needs to be set? Do we add the statement
                            tfodParameters.minimumConfidence = 0.75; in the initTfod() function itself? Thanks for your help. Also, is there a way to turn the phone's LED on and off to illuminate the FOV? We see big swings based on the lighting conditions and wonder if light from the LED would help. Thanks for your assistance. Regards, Vivek
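
                            For reference, the confidence threshold is a field on TFObjectDetector.Parameters and must be set before the detector is created, so inside initTfod(), just before the createTFObjectDetector() call, is the natural place for it. A sketch using the minimumConfidence field quoted above (field name as given in the tutorial; verify against your SDK version):

                            Code:
                            private void initTfod() {
                                int tfodMonitorViewId = hardwareMap.appContext.getResources().getIdentifier(
                                        "tfodMonitorViewId", "id", hardwareMap.appContext.getPackageName());
                                TFObjectDetector.Parameters tfodParameters = new TFObjectDetector.Parameters(tfodMonitorViewId);
                                // Report only detections scored at 75% confidence or better.
                                tfodParameters.minimumConfidence = 0.75;
                                tfod = ClassFactory.getInstance().createTFObjectDetector(tfodParameters, vuforia);
                                tfod.loadModelFromAsset(TFOD_MODEL_ASSET, LABEL_GOLD_MINERAL, LABEL_SILVER_MINERAL);
                            }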



                            • #15
                              Our team determined our field of view was not wide enough to reliably see all three minerals. We therefore decided to turn the robot slightly to guarantee which two minerals it sees, and adjusted the program logic to use only those two.
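
                              A sketch of that two-mineral elimination logic: if gold is one of the two visible minerals its position is known directly, and if both visible minerals are silver the gold must be in the third, unseen spot. The variable names are illustrative and assume the robot is turned so it reliably sees the left and center sampling positions.

                              Code:
                              // leftMineral and centerMineral are the two visible Recognitions, sorted by getLeft().
                              String goldPosition;
                              if ("Gold Mineral".equals(leftMineral.getLabel())) {
                                  goldPosition = "LEFT";
                              } else if ("Gold Mineral".equals(centerMineral.getLabel())) {
                                  goldPosition = "CENTER";
                              } else {
                                  goldPosition = "RIGHT";   // deduced: both visible minerals are silver
                              }
                              telemetry.addData("Gold Mineral Position", goldPosition);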

