9 Replies Latest reply on Mar 11, 2014 7:38 AM by JJY

    delayMicroseconds large overhead?

    JJY

      I am new to Galileo and Arduino - trying out the following code,


      void loop()

      {

        Serial.print(micros());

        Serial.print("\n");

        for(int i = 0; i < 10000; i ++){

            delayMicroseconds(10);

        }

        Serial.print(micros());

        Serial.print("\n");

      }

       

      It takes about 1.8 seconds to complete, so each delayMicroseconds(10) call takes about 180 microseconds instead of 10. How can I get better resolution here? Thanks!

        • 1. Re: delayMicroseconds large overhead?
          Clayton Hofrock

          Does the same problem occur if you increase the delay to 100 microseconds?

           

          I think there is a point where if the delay is small enough, it is no longer accurate.

           

          For Arduinos, this minimum reliable delay is 3 microseconds (see Arduino - DelayMicroseconds). For Galileo I think it is higher, but I am not sure of the number. It should not be too hard to experiment with different delays and find the threshold.

          • 2. Re: delayMicroseconds large overhead?
            JJY

            Thanks for replying. I can confirm that the same code runs in 0.101 seconds on an Arduino UNO. Trying different delays suggests a ~175 microsecond overhead on a Galileo for each call to delayMicroseconds, so the delayMicroseconds function is not implemented correctly for the Intel board. I am using the latest firmware for Galileo.

             

            Knowing this does not help with my problem of getting ~microsecond delay resolution though.

            • 3. Re: delayMicroseconds large overhead?
              Clayton Hofrock

              It might be more accurate to say that the threshold for the delayMicroseconds function is about 3 microseconds on an Arduino, and about 175 us on a Galileo.

               

              That is a significant difference.

              • 4. Re: delayMicroseconds large overhead?
                JJY

                I don't want to assert anything, but I am afraid I cannot agree with your opinion here... given the function's name and its performance on an Arduino board.

                • 5. Re: delayMicroseconds large overhead?
                  Len

                  The Galileo is running an operating system that hosts an environment for the Arduino software, while the Arduinos run on bare metal, so getting repeatable results from delayMicroseconds on the Galileo will not be possible.

                   

                  I was trying to drive three servos and update the pulse width sent to them every 200 milliseconds. Delay was useless. What wound up working was to record the milliseconds since the start of the application in a variable, compare the current millis() against that value plus 200, and when it went over, update the pulse widths, reset the variable to the current millis(), do my calculations and anything else that was going on, and loop. I would print out the elapsed time when I finished and found quite a bit of variability in the time it took to execute.

                  • 6. Re: delayMicroseconds large overhead?
                    HLane


                    Like JJY, I'm new to this world --- and also discovering (by a
                    slightly different route) that it's apparently impossible to write any
                    code at all that executes in less than milliseconds... not what I naively
                    expected from a 400 MHz processor! For example, the following program
                    generates a square wave on my 'scope --- with a period of 4.5
                    milliseconds.
                    ...................................................
                    /* Minimal Pin toggler */
                    void setup() {
                      pinMode(8, OUTPUT);
                    }

                     

                    void loop() {
                       digitalWrite(8, LOW);
                       digitalWrite(8, HIGH);
                    }
                    .......................................

                     

                    Now 4.5 ms x 400 MHz = 1,800,000, unless I'm doing my arithmetic
                    completely wrong ... is it really true that this little loop consumes
                    nearly two million clock ticks?

                     

                    In agreement with Len's comments, I find that the time consumed in
                    this loop is not very stable: if I display even 4 cycles of the waveform
                    on my 'scope, there is considerable jitter -- maybe 0.5 ms -- at the end
                    of the 4th cycle.

                     

                    It seems to me unlikely that there's any solution to this problem in
                    the context of Arduino/Galileo. Maybe Arduino alone, or bypass
                    Galileo's Arduino interface?

                     

                    -- Homer

                    • 7. Re: delayMicroseconds large overhead?
                      AlexanderMerz

                      "or bypass Galileo's Arduino interface?"


                      Then you are still on top of a multi-user, multi-tasking(!) operating system, kernel drivers and several other software layers. Always remember: the Intel Galileo is not just a microcontroller with some resistors on a PCB like the Arduino; the Galileo is a full-featured computer board (yeah, except for a graphics controller).

                      • 8. Re: delayMicroseconds large overhead?
                        Len

                        To get an idea of what goes on under the covers, post 7 in the following thread has some great references:

                        https://communities.intel.com/thread/48059

                         

                        I supported the Parallella Kickstarter, and one of the early recipients is using a Raspberry Pi as a graphics accelerator. I think it is possible to do the same with the Galileo if desired:

                         

                        https://www.youtube.com/watch?v=-lCWlu1EnsM

                         

                        The Galileo is quite a capable board, but it is more useful as the brain of a system than as a low-level hardware driver. The code I have implemented to drive a delta robot manipulator is fairly complex and full of floating-point math, but it gets the job done with time to spare in a 200 millisecond time slice while driving the servos. If it were attempting to generate the pulse widths without PWM support in the hardware, it would be hopeless.

                         

                        I once implemented a vision character recognition system using an early Pentium. The only reason it worked was that the code all fit in the cache. The Galileo is a bit more than five times faster than that processor.

                        • 9. Re: delayMicroseconds large overhead?
                          JJY

                          HLane, the problem you are facing can actually be solved using fast I/O supported by Galileo. See

                           

                          Re: I/O speeds?

                           

                          And Len, thanks for the replies. There is no doubt that the Galileo is a beefier board. But I am trying some simple things (generating hardware control signals) and hoped the Galileo could do them at least as well (and as easily) as an Arduino board. It is a small piece of a large puzzle I am trying to deal with, so unfortunately I won't be able to spend much time on every small piece.
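For later readers: the fast I/O path JJY mentions looked roughly like the sketch below in the Galileo Arduino IDE. This is a hypothetical reconstruction from memory of Intel's Galileo release notes, not host-runnable code -- in particular, the OUTPUT_FAST pin mode and the restriction to pins 2 and 3 (the only pins wired to the Quark's CPU-native GPIO on Gen 1 boards) are assumptions to verify against your firmware's documentation:

```cpp
/* Hypothetical Galileo fast-GPIO sketch; OUTPUT_FAST and the pin-2/3
   limitation are assumptions -- check Intel's release notes for your
   firmware version before relying on this. */
void setup() {
  pinMode(2, OUTPUT_FAST);   // request the CPU-attached fast GPIO for pin 2
}

void loop() {
  digitalWrite(2, LOW);      // with OUTPUT_FAST, these writes bypass the
  digitalWrite(2, HIGH);     // slow I2C port-expander path used by default
}
```

Even with fast I/O, the OS-jitter caveats raised by Len and AlexanderMerz above still apply; the fast path improves toggle rate, not determinism.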