  • Writing Robot Code without the Robot, with a single phone or on the Android emulator

    The NullOp (no hardware) Configuration is a great place to get started for teams that do not have the new hardware yet. However, each developer on a team still needs a second phone for the Driver Station App while developing Op Modes on the Robot Controller App. In addition, the NullOp configuration, although handy for testing telemetry between the phones, does not lend itself to writing Op Modes that process sensor input or write to motors. It is, however, a good place to practice Java and create mock Op Modes that can later be converted to Op Modes that work with hardware.
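    For instance, a mock Op Mode along these lines runs in the NullOp configuration because it touches nothing but telemetry (a minimal sketch only; the package, class name and telemetry keys are placeholders, and the init()/loop() callbacks are those of the SDK's OpMode base class):

        package com.qualcomm.ftcrobotcontroller.opmodes;

        import com.qualcomm.robotcore.eventloop.opmode.OpMode;

        // Mock Op Mode: exercises only telemetry, so it can run with no
        // hardware attached and later be converted to a hardware Op Mode.
        public class MockTelemetryOp extends OpMode {

            private int loopCount = 0;

            @Override
            public void init() {
                telemetry.addData("Status", "initialized");
            }

            @Override
            public void loop() {
                loopCount++;
                telemetry.addData("Status", "running");
                telemetry.addData("Loops", String.valueOf(loopCount));
            }
        }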

    Given these limitations, I decided to fork the FTC_APP project with the following goals in mind:

    1) Debug/Develop Op Modes on the Robot Controller App without the need for a second phone for the driver station.
    2) Allow for this development to work within the Android Studio emulator (in case a single phone is not available to the developer)
    3) Allow the developer to code an Op Mode that is fully pluggable whether connected to the robot or using the emulator.
    This would be an advantage in a classroom situation, as Op Modes can be coded with a single phone or the emulator and do not have to be converted before being attached to the robot.

    I now have an example that demonstrates an IRSeeker Op Mode which can run from the Driver Station app connected to a robot, or, in "Null Hardware debug mode", from the emulator.
    The following image shows a screen where the user can enter values for the IR sensor to test the Op Mode's loop() method, which processes the sensor input.
    Note that this is a generic screen that scans the components defined in the Op Mode and dynamically renders the sensor input fields and motor outputs, so anyone using this only needs to code Op Mode classes.
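    To make the loop() processing concrete, an IRSeeker Op Mode of the kind being exercised here has roughly the following shape (a sketch only; the hardware names "ir_seeker", "motor_left" and "motor_right" and the steering math are my own assumptions, not code from the fork):

        import com.qualcomm.robotcore.eventloop.opmode.OpMode;
        import com.qualcomm.robotcore.hardware.DcMotor;
        import com.qualcomm.robotcore.hardware.IrSeekerSensor;
        import com.qualcomm.robotcore.util.Range;

        public class IrSeekerDemoOp extends OpMode {

            private IrSeekerSensor irSeeker;
            private DcMotor motorLeft;
            private DcMotor motorRight;

            @Override
            public void init() {
                irSeeker   = hardwareMap.irSeekerSensor.get("ir_seeker");
                motorLeft  = hardwareMap.dcMotor.get("motor_left");
                motorRight = hardwareMap.dcMotor.get("motor_right");
            }

            @Override
            public void loop() {
                // Steer toward the IR beacon: a positive angle is treated as
                // "beacon to the right", so the left side speeds up slightly.
                double angle = irSeeker.getAngle();
                double turn  = Range.clip(angle / 30.0, -1.0, 1.0);

                motorLeft.setPower(0.3 + 0.2 * turn);
                motorRight.setPower(0.3 - 0.2 * turn);

                telemetry.addData("angle", String.valueOf(angle));
                telemetry.addData("strength", String.valueOf(irSeeker.getStrength()));
            }
        }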



    The code can be found here: https://github.com/levydev/ftc_app.git
    This fork can also be opened and run in Android Studio in the same manner as the original.

    If you think this has potential, please let me know what improvements, if any, can be made.

    Enjoy

    David Levy
    Mentor - Team 519 Epsilon Delta

  • #2
    Great! I think there is a definite need for more "offline" development and testing.

    I am trying to get a pre-configured, single-PC development environment operational, ideally with the Driver Station application connected to the controllers.

    One of the things I am struggling with is the WifiDirect connection: I have not been able to get the DriverStation to talk to the RobotController in an emulated (Android emulator or Android-x86 in a VM) environment.
    It would be much easier if we could just use a "standard" TCP/IP connection. This would allow easy connection of the two simulators.

    Do you have any suggestions or ideas on that?



    • #3
      Originally posted by pbrier View Post
      Great! I think there is a definite need for more "offline" development and testing.
      I am trying to get a pre-configured, single-PC development environment operational, ideally with the Driver Station application connected to the controllers.
      This thread is about changing the Robot Controller so Op Modes can be developed WITHOUT the need for the Driver Station App. I did this by wrapping the hardware component classes supplied by FTC with my own classes so they can be used in this mode of operation.
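      As a rough illustration of that wrapping idea (the class and method names below are mine for the sketch, not the actual classes in the fork): a motor wrapper can forward to the real DcMotor when hardware is present and simply record the value in debug mode so a screen can display it.

          import com.qualcomm.robotcore.hardware.DcMotor;

          // Illustrative wrapper only. With real hardware it delegates to the
          // SDK's DcMotor; in "null hardware" debug mode it just remembers the
          // last power so the debug screen can render it.
          public class DebuggableMotor {

              private final DcMotor realMotor;   // null while in debug mode
              private double lastPower = 0.0;

              public DebuggableMotor(DcMotor realMotor) {
                  this.realMotor = realMotor;
              }

              public void setPower(double power) {
                  lastPower = power;
                  if (realMotor != null) {
                      realMotor.setPower(power);
                  }
              }

              public double getLastPower() {
                  return lastPower;
              }
          }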

      If I had the code for the Driver Station App, I'd probably solve your problem by swapping out the WifiDirect piece with a UDP client.
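      Just to sketch what I mean (the payload and port number here are made up, since the actual protocol between the two apps isn't published), plain UDP between two emulator instances would only need something like:

          import java.net.DatagramPacket;
          import java.net.DatagramSocket;
          import java.net.InetAddress;

          // Plain-UDP sketch: send a made-up payload from one emulator toward the
          // host, which can forward the port on to a second emulator instance.
          public class UdpLinkDemo {
              public static void main(String[] args) throws Exception {
                  DatagramSocket socket = new DatagramSocket();
                  byte[] payload = "joystick:0.5,-0.25".getBytes("UTF-8");
                  // 10.0.2.2 is how an Android emulator reaches its host machine.
                  DatagramPacket packet = new DatagramPacket(
                          payload, payload.length, InetAddress.getByName("10.0.2.2"), 6666);
                  socket.send(packet);
                  socket.close();
              }
          }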



      • #4
        Yes. I understand. But I see several scenarios that can be used for development (https://github.com/pbrier/ftc_app/wi...nt-and-testing). Yours is definitely a useful one.

        I also wish I had the Driver Station source; I have a device incompatibility that prevents me from using the joystick with some of my devices. I think I know what the source of the problem is (an API call the Driver Station app uses to detect the controller that is only available on newer Android devices), and there is possibly a workaround, but I cannot implement it without the sources.

        I already had a look at the Wifi direct communication (https://github.com/pbrier/ftc_app/wiki/WiFi-Direct) and probably some UDP protocol is used, but possibly scrambled over a WifiDirect channel.



        • #5
          I like your list of tested devices: https://github.com/pbrier/ftc_app/wiki/Tested-Devices

          Although I have not compiled a list, I'd imagine most Android phones should work with the fork that I created here: https://github.com/levydev/ftc_app.git

          That said, I'd recommend that the resulting Op Modes be run on FTC-sanctioned devices during competition.



          • #6
            The latest changes to https://github.com/levydev/ftc_app support screen input for all the supplied Op Modes.

            For a quick demo you can download the app directly to your phone here:
            https://www.dropbox.com/s/d7p5vog5z1...debug.apk?dl=0

            However, this is a tool for coding and debugging the Robot Controller from a single phone or the Android emulator. In order to do that you will need to download the source from: https://github.com/levydev/ftc_app



            • #7
              Improved Screens to Enter Test Sensor Data

              The screens have been improved to offer sliders that limit the sensor inputs to the ranges specified in the Javadocs.

              You can quickly install the app and run through all of the OpModes here:
              https://github.com/levydev/ftc_app/r...ller-debug.apk

              It would be great if someone could run the app; I'd appreciate any comments.

              If you want to test changes to the existing Op Modes or create new ones, you'll need the source here:
              https://github.com/levydev/ftc_app
              Having the source will also let you develop entirely in the Android Studio emulator, which is ideal for classroom exercises since no phones or robots are required.

              The following is a summary of the screens used to test various OpModes:
              (Note: Op Modes can be chosen from the drop-down Settings menu at the top right.)

              1) IR Seeker OpMode Sensor Inputs





              2) Tank Drive OpMode Gamepad inputs



              Again, feedback and questions are welcome.



              • #8
                I have a few questions: does this use or work with the "True SDK" in the forum, or is this based on the official repo? Does the target Android API level have a broader range than the official repo (like Android 4.1.2-5.1 versus Android 4.4.2)?



                • #9
                  Originally posted by dmssargent View Post
                  does this use or work with the "True SDK" in the forum, or is this based on the official repo?
                  It's presently a fork of the official repo.


                  Originally posted by dmssargent View Post
                  Does the target Android API level have a broader range than the official repo (like Android 4.1.2-5.1 versus Android 4.4.2)?
                  Yes. The goal was to allow it to be used on an emulator or a larger range of phones while developing. I can restrict it further if I think a user is likely to code something into an Op Mode class that won't work on the ZTE phone. I'm not sure that is likely at the moment.

                  The goal here is to use this enhancement to fully develop the Op Modes and then switch back to running them from the Driver Station App for competition. That should work as FTC has specified (with two phones and a connected robot).



                  • #10
                    Hi David,

                    Great work! I hope to try to get this running over the next few weeks.

                    I was wondering how easy or difficult it would be to add the following features (similar to what was done here for RobotC):

                    1. It would be great if autonomous routines could be tested on the robot without having a Driver Station device. For example, could you change your version of the app to have an Op Mode selector and a "Start" button (like on the Driver Station)? Previously, we've done this to test autonomous routines, where we "start" the routine, the Robot device beeps 4 times (alerting the students that the program is about to run), and then it starts. This is really helpful for testing on a practice field without a Driver Station device. We also implemented an "abort" button that allowed the program to NOT run if the abort button was pressed while the 4 beeps were happening, in case the program was started by mistake.

                    2. It would also be great if there was a Robot Controller app mode that allowed on-device testing of each motor and servo. In previous years, we've used this during setup to check that each motor is wired correctly and that each motor and motor controller is working, and in practice or in the pits to test motors after a match that might be smoked or not moving. We step through the list of motors and then hit a button to move the motor forward or backward at full speed (or at a user-selected speed). This can also be useful for raising or lowering arms or lifts, etc., during maintenance. The same function is useful for manipulating servos and even for determining what the ideal servo setting is for a particular position (e.g., "what value do we program the servo to in order to grab the goal?"). This could be implemented with on-screen buttons/sliders/etc. in a Robot device app. It seems like your latest version has inputs simulating the gamepad: I wonder if you could make a similar version with inputs that just directly set the motor powers and servo settings. Again, see the example RobotC code at the link above for an example.

                    Thoughts? Or would it be better for these functions to be completely independent versions of the app?



                    • #11
                      Hi Cheer4FTC,

                      What I've provided is for the development and testing of autonomous / tele-op routines (Op Modes) that can be run without the hardware. What differentiates this from the Null OpMode solution recommended by FTC is that these new Op Modes can be run unchanged when connected to the hardware, and switched back to debug mode at any time after.

                      It sounds like what you are asking for is special testing in situations where the Robot Controller phone is attached to real hardware.

                      First, let's consider the case you cited: the need to start and stop autonomous Op Modes without a separate driver station phone. I think the best solution would be for FTC to change their Driver Station App so it can talk to the Robot Controller app on the same phone. Communication between apps is a very common use case in Android.
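                      Something along these lines would be enough on the sending side (just a sketch of ordinary inter-app messaging; the action name, extra name and target package are my assumptions, and both apps would of course need matching receivers):

                          import android.content.Context;
                          import android.content.Intent;

                          // Sketch of same-device app-to-app messaging via an explicit broadcast.
                          public class SameDeviceLink {
                              public static void sendStart(Context context, String opModeName) {
                                  Intent intent = new Intent("org.example.ftc.ACTION_START_OPMODE");
                                  intent.setPackage("com.qualcomm.ftcrobotcontroller"); // assumed target package
                                  intent.putExtra("opModeName", opModeName);
                                  context.sendBroadcast(intent);
                              }
                          }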

                      Second, you cited the need to test individual motors.
                      I suppose that can be done now by creating a special Op Mode for each motor. The testing would, of course, have to occur with a separate Driver Station app connected to a gamepad.
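                      Such a per-motor test Op Mode could be as simple as the following (a sketch; the hardware name "motor_left" and the half-power value are arbitrary choices, not anything from the fork):

                          import com.qualcomm.robotcore.eventloop.opmode.OpMode;
                          import com.qualcomm.robotcore.hardware.DcMotor;

                          // One motor under test: A runs it forward, B runs it in reverse,
                          // releasing both buttons stops it.
                          public class SingleMotorTestOp extends OpMode {

                              private DcMotor motor;

                              @Override
                              public void init() {
                                  motor = hardwareMap.dcMotor.get("motor_left");
                              }

                              @Override
                              public void loop() {
                                  if (gamepad1.a) {
                                      motor.setPower(0.5);
                                  } else if (gamepad1.b) {
                                      motor.setPower(-0.5);
                                  } else {
                                      motor.setPower(0.0);
                                  }
                                  telemetry.addData("power", String.valueOf(motor.getPower()));
                              }
                          }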

                      Lastly, you asked to be able to test the motors through screens similar to the ones I have created to simulate sensor and gamepad input when hardware is not available.
                      I think the best place to do this is the Driver Station app and not the Robot Controller app (the Driver Station app running on a separate phone or, to my earlier point, as a second app alongside the Robot Controller).
                      My recommendation would be for FTC to release the source of the Driver Station app, much as they did with the Robot Controller app. This could be partial source for just the screens; the WifiDirect usage would be confined to library files, much as in the Robot Controller app. This would give the user the ability to create screens with controls to be used in lieu of the buttons on the gamepad.
                      In addition to satisfying your testing requirements, I think this would be a great option for teams to try during competition. Why force teams to use the gamepad buttons?

                      I'll leave you with a prototype that I created in 2013 that does not require a gamepad. There is a good argument for coding the buttons on the screens, but as you can see from the video, maybe not a great argument for replacing the joysticks.

