TensorFlow Lite Tutorial

  • Tom Eng
    started a topic TensorFlow Lite Tutorial

    Hi Folks,

    There is a TensorFlow Lite tutorial that shows how to detect Gold and Silver Minerals (using either Blocks or Java) with Google's TensorFlow technology.

    https://github.com/ftctechnh/ftc_app...eral-Detection

    Tom
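For the Java flavor of the tutorial, the heart of the sample op mode is a loop that polls the detector and reads back labeled recognitions. A rough sketch, with class and label names taken from the ConceptTensorFlowObjectDetection sample (this is an SDK-dependent fragment, not runnable outside the FTC Robot Controller app):

```java
// Sketch of the detection loop from the ConceptTensorFlowObjectDetection
// sample. Assumes `tfod` (a TFObjectDetector) was initialized as shown in
// the tutorial; not runnable outside the FTC SDK.
while (opModeIsActive()) {
    List<Recognition> recognitions = tfod.getUpdatedRecognitions();
    if (recognitions != null) {  // null means no new frame since the last call
        telemetry.addData("# Objects Detected", recognitions.size());
        for (Recognition r : recognitions) {
            // The Rover Ruckus model reports "Gold Mineral" or "Silver Mineral".
            telemetry.addData(r.getLabel(), "left=%.0f", r.getLeft());
        }
        telemetry.update();
    }
}
```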

  • vmoudgal
    replied
    Hello Msdickc, if you are already able to detect the location of the gold sample, and assuming your robot has enough clearance to pass over the samples without dislodging them, I would suggest attaching servo-driven arms on the two sides as well as at the bottom of the robot to move the gold sample. That way the robot can make the same movement regardless of the gold sample's location and knock it off its spot just by activating the appropriate arm. You will need two autonomous programs depending on where you start (the depot side or the crater side of the lander).

    Regarding safe paths:
    1. If you start on the alliance depot (gold sample) side, move so that the robot can park at the crater on its left side.
    2. If starting on the crater side (silver sample), move the robot to end up at that same crater.

    These paths will minimize the chance of collisions on the field. Also, hug the walls as much as safely possible while moving so that you avoid knocking the silver samples on your way to the crater. Good luck.
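The three-arm idea above reduces the autonomous branching to a single lookup: the drive path is identical for all three sample layouts and only the arm choice depends on the gold position. A self-contained sketch (all names are made up for illustration; nothing here is FTC SDK API):

```java
// Hypothetical sketch of the three-arm strategy: servo arms on the left,
// right, and bottom of the robot, fired based on where the gold sample is.
// All names invented for illustration; not FTC SDK API.
public class ArmSelector {
    public enum GoldPosition { LEFT, CENTER, RIGHT }
    public enum Arm { LEFT_ARM, BOTTOM_ARM, RIGHT_ARM }

    // One lookup replaces three separate autonomous drive routines.
    public static Arm armFor(GoldPosition gold) {
        switch (gold) {
            case LEFT:  return Arm.LEFT_ARM;
            case RIGHT: return Arm.RIGHT_ARM;
            default:    return Arm.BOTTOM_ARM;  // CENTER
        }
    }
}
```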

    Leave a comment:


  • Msdickc
    replied
    I am part of team #15237. This is our first year participating, and as the programmer I need help formulating the code that gives the robot the ability to distinguish between the two different minerals. I already have multiple autonomous programs depending on where we start a match (whether the crater or the depot is behind us). I need help writing the code (I use Blocks programming). I would also appreciate it if the example you give includes branches that control how the robot moves based on where the minerals are, just so I can get a general idea that I can later fine-tune.

    Leave a comment:


  • MikeRush
    replied
    When trying to create a new op mode based on the "ConceptTensorFlowObjectDetection" sample, it is not found in the drop-down box. Is there an update or other file(s) that need to be downloaded to make it available as a sample?

    Leave a comment:


  • 11343_Mentor
    replied
    Thanks Tom. As I mentioned in the other thread, might I suggest you test under lighting like that of a typical school gymnasium? I don't know what they use these days (it used to be mercury vapor lights), but this type of lighting is where TensorFlow did not work very well. All of the competitions in our area are in school gyms.

    Leave a comment:


  • Tom Eng
    replied
    Originally posted by 11343_Mentor View Post
    Thanks, I also posted in multiple threads that this is not the issue we are seeing, we have been in auto rotate from the beginning. The issue is due to incorrect modeling of the silver mineral, IMHO.
    11343_Mentor - I'll see if I can do some testing in a bright environment (i.e., lower contrast) and see how the silver mineral detection goes. Thanks for the helpful info/feedback.

    Tom

    Leave a comment:


  • Cheer4FTC
    replied
    11343_Mentor Could you respond to the following thread with details about your setup? Tom Eng is trying to reproduce the issue so he can debug and possibly improve the performance. In particular:

      • Are you using a phone camera or a webcam? Which brand?

      • What lighting environment are you using? Are there any lighting conditions that are particularly problematic?

      • Where is the camera mounted (low, looking horizontally, or high, looking down on the minerals)?

      • Is there anything else you notice that could help him reproduce the problem?

    https://ftcforum.usfirst.org/forum/f...-and-blue-tape

    Leave a comment:


  • vmoudgal
    replied
    Hi Noah, thanks for the link to the camera flash control commands. Will try it out. Is there a way to increase the field of view of the camera (we are using landscape mode)? Thanks.

    Leave a comment:


  • Noah
    replied
    Originally posted by vmoudgal View Post
    also, is there a way to turn on and off the phone LED to illuminate the FOV? we see big swings based on the lighting conditions and wonder if the light from the LED will help. thanks for your assistance. Regards vivek
    See this thread: https://ftcforum.usfirst.org/forum/f...9-camera-flash
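For reference, torch control on a Vuforia-backed camera typically goes through Vuforia's CameraDevice singleton. A minimal sketch (the exact call is stated from memory, so verify it against the linked thread; assumes Vuforia is already initialized and owns the camera):

```java
// Hypothetical helper: toggles the phone's LED torch through Vuforia's
// CameraDevice singleton (com.vuforia.CameraDevice). Assumes Vuforia is
// already running and owns the camera; verify against the linked thread.
void setTorch(boolean on) {
    com.vuforia.CameraDevice.getInstance().setFlashTorchMode(on);
}
```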

    Leave a comment:


  • mjurisch2017
    replied
    Guess I should mention: you know the mineral set is made up of two silvers and one gold. Using this logic and two of the mineral locations, you can deduce the third mineral's location.
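That two-of-three deduction can be written as a tiny pure-Java helper. The label strings match the Rover Ruckus model; the class itself is hypothetical, not part of the FTC SDK:

```java
// Hypothetical helper implementing the two-of-three deduction: the set is
// always two silvers and one gold, so seeing any two positions pins down
// the third. Assumes the camera reliably sees the LEFT and CENTER samples.
public class GoldDeducer {
    public enum GoldPosition { LEFT, CENTER, RIGHT }

    // leftLabel/centerLabel are the TFOD labels for the two visible samples,
    // e.g. "Gold Mineral" or "Silver Mineral" (Rover Ruckus model labels).
    public static GoldPosition deduce(String leftLabel, String centerLabel) {
        if ("Gold Mineral".equals(leftLabel))   return GoldPosition.LEFT;
        if ("Gold Mineral".equals(centerLabel)) return GoldPosition.CENTER;
        // Two silvers in view: the unseen third sample must be the gold.
        return GoldPosition.RIGHT;
    }
}
```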

    Leave a comment:


  • mjurisch2017
    replied
    Our team determined our field of view was not wide enough to reliably see all three minerals. We therefore decided to turn the robot slightly to guarantee which two minerals it sees, and adjusted the program logic to use those two.

    Leave a comment:


  • vmoudgal
    replied
    Hello, we are trying to use the object detection example with TensorFlow Lite and are running into reliability issues. The tutorial text mentions a way to change the confidence level one wants to use. Could someone tell us where this needs to be set? Do we add the statement
    tfodParameters.minimumConfidence = 0.75; in the initTfod() function itself? Thanks for your help. Also, is there a way to turn the phone LED on and off to illuminate the FOV? We see big swings based on lighting conditions and wonder whether the light from the LED would help. Thanks for your assistance. Regards, Vivek
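For anyone finding this later: yes, the threshold goes inside initTfod(), after the Parameters object is created and before the detector is built. A sketch based on the SDK's ConceptTensorFlowObjectDetection sample (class and field names from that sample; an SDK-dependent fragment, not runnable outside the FTC Robot Controller app):

```java
// Sketch based on the ConceptTensorFlowObjectDetection sample (Rover Ruckus
// era SDK); not runnable outside the FTC Robot Controller app.
private void initTfod() {
    int tfodMonitorViewId = hardwareMap.appContext.getResources().getIdentifier(
            "tfodMonitorViewId", "id", hardwareMap.appContext.getPackageName());
    TFObjectDetector.Parameters tfodParameters =
            new TFObjectDetector.Parameters(tfodMonitorViewId);
    // Set the threshold here, after creating the Parameters object and
    // before creating the detector; detections below it are discarded.
    tfodParameters.minimumConfidence = 0.75;
    tfod = ClassFactory.getInstance().createTFObjectDetector(tfodParameters, vuforia);
    tfod.loadModelFromAsset(TFOD_MODEL_ASSET, LABEL_GOLD_MINERAL, LABEL_SILVER_MINERAL);
}
```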

    Leave a comment:


  • FTC12676
    replied
    Here is the actual training model:
    https://github.com/google/ftc-object...aster/training
    and here is a tutorial on training a model from scratch:
    https://medium.com/tensorflow/traini...s-b78971cf1193

    I told my team to explore this and try modeling it themselves. Let's see how far we can go given the upcoming competition and the pending robot build, but I suggest other teams try it as well.

    Leave a comment:


  • 11343_Mentor
    replied
    Silver doesn't work in many conditions. I am fairly confident that they made a mistake when modeling the silver minerals: I believe they modeled them alone against a contrasting background. My theory is that when you put the white balls on the red and blue squares, the red and blue peeking out from underneath throws off the detection. We have done a lot of testing: TensorFlow detects the white balls just fine on a contrasting background, but on the red or blue squares the detection degrades. As the light levels increase, especially in bright gymnasiums, the detection rate falls to about 50% in these conditions. It is far worse on the red squares.

    They should have modeled the white balls in real-world conditions, i.e., on red and blue 2-inch squares under lighting similar to a typical gymnasium (a real-world competition scenario).

    Leave a comment:


  • FTC7039
    replied
    We noticed that the silver minerals don't show up well on our white floor. We can read all three best when we put our gray mat on a dark floor. I don't know how this will translate to competition, though. We might try increasing our minimum confidence above 50% and see how that affects the outcome.

    Leave a comment:
