Servo and PWM don't work at the same time on the same timing peripheral #32

Open · reconbot opened this issue Oct 16, 2014 · 39 comments

@reconbot

I've been working on solving an issue with J5, spark-io, and voodoospark, and have traced it down to the latest master of voodoospark having issues with servos and PWM pins at the same time.

rwaldron/particle-io#39 has some of my troubleshooting.

Since the version of voodoospark that comes with the spark-cli is precompiled, I don't know what version it is or how to compare it to the latest master.

I also can't reliably predict which device, the PWM or the servo, won't work. It's one or the other.

@rwaldron (Collaborator)

Is this a bug in voodoospark or a hardware capability issue?

@reconbot (Author)

This is with no extra hardware, just a pulse-counting Arduino. I also tried with 2 generations of Spark Cores. Both work with the CLI tool's copy of voodoospark and not with master.

@voodootikigod (Owner)

That helps a lot. I can go from there.


@rwaldron (Collaborator)

I'm confused. In the spark-io bug you said the precompiled voodoospark works fine but the latest master doesn't, yet that isn't reflected here.

@rwaldron (Collaborator)

This is with no extra hardware

That's not what I meant. I'm thinking about available timers, PWM vs. Servo, etc.

@reconbot (Author)

I mean to say that the precompiled version of voodoospark that comes with the spark-cli works, and the latest master of voodoospark (specifically at #163641e) is giving me issues.

@rwaldron (Collaborator)

I understand that part :P I only meant that you didn't include that info in this bug. Ok, moving on, I filed this: particle-iot/spark-cli#101

@zsup commented Oct 16, 2014

@voodootikigod sounds like you're on it, but if you need any input from the Spark team, lemme know

@reconbot (Author)

It actually seems like this may be present in the version of voodoospark included with the spark-cli too. (What version is that?)

I'll have a reduced test case a little later tonight.

@rwaldron (Collaborator)

2.2.0

@Resseguie (Collaborator)

@reconbot did you try logging the values being written by spark-io just before they are sent as suggested by @rwaldron in the original bug report?

rwaldron/particle-io#38 (comment)

I'm curious to see how many commands (and how often) are being sent. Too many commands at once have always been the culprit for me when I get the red SOS flashing. I've run into it while testing the LED lib (e.g. pulse with short or no delay). I still suspect that might be the problem here.

If so, we might need to revisit the possibility of throttling the commands being sent, though I'm not sure how best to handle that. It's relatively straightforward for something like an LED (just set a minimum delay) but much more complicated for motors or multiple devices at once.
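
(As a rough illustration only: a minimal sketch of the minimum-delay idea, assuming a hypothetical enqueueCommand() wrapper around the TCP connection to the Core; none of this is existing spark-io code, and the address/port are placeholders.)

var net = require('net');

// Hypothetical connection details; replace with your Core's address/port.
var socket = net.connect(48879, '192.168.1.42');

var MIN_DELAY_MS = 10;   // illustrative minimum gap between commands
var queue = [];
var draining = false;

// Instead of writing each command buffer immediately, queue it...
function enqueueCommand(buffer) {
  queue.push(buffer);
  if (!draining) {
    draining = true;
    drain();
  }
}

// ...and flush the queue no faster than one command per MIN_DELAY_MS.
function drain() {
  if (queue.length === 0) {
    draining = false;
    return;
  }
  socket.write(queue.shift());
  setTimeout(drain, MIN_DELAY_MS);
}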

@reconbot (Author)

I did; I'll do it again to give you some output. I don't think it's an issue with a flood of data (unless 3 commands in a row can trigger the issue; I don't get a red flash of death in any case). What I expect is being sent to the Spark: it reads it from the network and outputs the correct values as serial debug statements, but the pins are not doing what we would expect.

@reconbot (Author)

Alright, I have a reduced test case.

// board is the Johnny-Five board from the setup described above
board.on('ready', function(){
  var pwmPin = "A0";
  var servoPin = "A1";

  // A1 in servo mode, A0 in PWM mode
  this.pinMode(servoPin, this.MODES.SERVO);
  this.pinMode(pwmPin, this.MODES.PWM);
  this.servoWrite(servoPin, 90);

  // five seconds later, start PWM on A0
  setTimeout(function(){
    console.log('pwm on');
    this.analogWrite(pwmPin, 200);
  }.bind(this), 5000);
});

On an Arduino I'm monitoring pin A1.

Channel 1:1465
Channel 1:1465
Channel 1:1464
// PWM Pin on
Channel 1:61
Channel 1:61
Channel 1:61

And I have a serial logger on the Spark Core:

Bytes Available: 3
Action received: 0
PIN received: 11
MODE received: 4
Bytes Available: 6
Action received: 0
PIN received: 10
MODE received: 1
Bytes Available: 3
Action received: 65
PIN: 11
WRITING TO SERVO: 90
// pwm on
Bytes Available: 3
Action received: 2
PIN received: 10
VALUE received: C8

Lastly, I uncommented the console logs (and annotated them) in spark-io's pinMode and *write functions.

pinMode <Buffer 00 0b 04>
pinMode <Buffer 00 0a 01>
write <Buffer 41 0b 5a>
pwm on
write <Buffer 02 0a c8>
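
(Decoding those buffers against the serial log above — each buffer is [action, pin, value/mode]:)

<Buffer 00 0b 04>  ->  action 0x00 (pin mode),      pin 0x0b = 11 (A1),  mode  0x04 = SERVO
<Buffer 00 0a 01>  ->  action 0x00 (pin mode),      pin 0x0a = 10 (A0),  mode  0x01 = OUTPUT
<Buffer 41 0b 5a>  ->  action 0x41 = 65 (servo write),  pin 11 (A1),     value 0x5a = 90
<Buffer 02 0a c8>  ->  action 0x02 (analog write),      pin 10 (A0),     value 0xc8 = 200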

@reconbot (Author)

I forgot the best part of a bug report:

I expected the servo pin not to change output when setting a pulse width on a different pin; the servo pin, however, dropped to almost a 0-degree angle.

@rwaldron (Collaborator)

Doesn't make sense:

PIN received: 10
MODE received: 1

Mode should be 3

@reconbot (Author)

P.S. Those tests were with v2.2.0. If it helps, I can repeat with master later tonight.

@rwaldron (Collaborator)

Actually, I don't know what that even means at the moment.

@rwaldron (Collaborator)

Re:

Doesn't make sense:

PIN received: 10
MODE received: 1
Mode should be 3

I'm wrong; it's correct to set the pin to regular output mode.

@reconbot (Author)

Well... this may not be our problem.

This C program suffers from the same issue.

Servo s;

void setup() {
  Serial.begin(9600);
  s.attach(A1);          // servo signal on A1
  pinMode(A0, OUTPUT);   // A0 prepared for analogWrite
}

void loop() {
  Serial.println("A1 Servo to 90");
  s.write(90);
  delay(5000);
  Serial.println("A0 PWM to 200");
  analogWrite(A0, 200);  // starting PWM here is what disturbs the servo on A1
  delay(5000);
}

@Resseguie (Collaborator)

@zsup then that might be something you all will want to look into from your end?

@zsup commented Oct 21, 2014

My gut says that the reason for this issue is that A1 and A0 are both on the same timer peripheral, which means that one can't act as a PWM pin while the other is controlling a servo, because the timer patterns are different. @satishgn is that correct?

@reconbot (Author)

@zsup any idea which pins might not be?

@zsup commented Oct 21, 2014

see this document:

https://github.com/spark/core/blob/master/Pin%20mapping/core-pin-mapping-v1.xlsx

A0 and A1 are on the same timer peripheral (Timer 2), but A4, A5, A6, and A7 are on Timer 3, and D0 and D1 are on Timer 4
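
(A minimal sketch of that mapping as data, based only on the pins listed above; TIMER_FOR_PIN and sharesTimer are illustrative names, not part of spark-io or voodoospark.)

// Spark Core pins grouped by timer peripheral, per the pin-mapping spreadsheet above.
var TIMER_FOR_PIN = {
  A0: "TIM2", A1: "TIM2",
  A4: "TIM3", A5: "TIM3", A6: "TIM3", A7: "TIM3",
  D0: "TIM4", D1: "TIM4"
};

// Servo and analogWrite only clash when the two pins share a timer.
function sharesTimer(pinA, pinB) {
  return TIMER_FOR_PIN[pinA] !== undefined &&
         TIMER_FOR_PIN[pinA] === TIMER_FOR_PIN[pinB];
}

So sharesTimer("A0", "A1") is true, while sharesTimer("A0", "A4") is false, which matches the observation in the next comment.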

@reconbot (Author)

A4 and A0 have no interactions!
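
(In other words, a workaround for the reduced test case above is to keep the servo and the PWM output on different timers. A sketch, assuming the same Johnny-Five setup, with the PWM pin moved from A0 to A4 so it no longer shares Timer 2 with the servo:)

board.on('ready', function(){
  var servoPin = "A1"; // Timer 2
  var pwmPin = "A4";   // Timer 3, so it does not share the servo's timer

  this.pinMode(servoPin, this.MODES.SERVO);
  this.pinMode(pwmPin, this.MODES.PWM);
  this.servoWrite(servoPin, 90);

  setTimeout(function(){
    this.analogWrite(pwmPin, 200); // servo output on A1 should be unaffected
  }.bind(this), 5000);
});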

@reconbot (Author)

We should probably take this over to spark-io, but a warning or error when running incompatible pin modes, or pins that have interactions, would be great. I'm not sure what I'm reading in the Excel sheet, but I'd love to figure out exactly what doesn't work.

@rwaldron (Collaborator)

Spark-IO can't have special knowledge like that. Does Firmata warn of similar things on an UNO?

@reconbot (Author)

Does this happen on the UNO?

@satishgn

@reconbot, as @zsup pointed out, it's happening because the A0 and A1 channels both belong to the same TIM peripheral. Servo works at a fixed PWM frequency of 50 Hz, whereas analogWrite() works at a fixed PWM frequency of 500 Hz. Currently we don't have a check in place to warn about possible clashes between Servo and analogWrite().

@reconbot (Author)

If we reported back to spark-io what hardware we had, then maybe it would be in a position to provide a warning. But I see why it's a bad idea now.

Can I assume that all the timing peripherals have this limitation?

@rwaldron (Collaborator)

@reconbot that's the type of thing that I would put in Johnny-Five, but how can we signal to Johnny-Five that there are hardware constraints? Can we make it a generalized mechanism that Firmata.js, Galileo-IO, etc. can also implement?

@zsup commented Oct 21, 2014

My gut reaction is that the correct way to do this would be to give each bit of hardware a JSON file that stores the hardware peripherals. Besides keeping the user from setting up a servo and analogWrite on the same timer peripheral, this could also do things like block the user from doing an analogWrite on a pin that doesn't have a timer peripheral at all. Perhaps the JSON would look something like this:

{
  "device": {
    "type": "spark-core",
    "version": "1.0",
    "pins": {
      "A0": ["Timer 2", "ADC"]
    }
  }
}
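
(To make the idea concrete, a sketch of how an IO plugin could consume such a file; the file name, checkPinMode(), and the assumption that every pin is listed are all additions on top of the proposal above, not existing spark-io behavior.)

// Load the proposed per-device description (illustrative file name).
var device = require("./spark-core.json").device;

var timerModes = {};  // timer peripheral -> mode currently using it, e.g. { "Timer 2": "SERVO" }

// Returns a warning string if the requested mode clashes with an earlier one,
// or null if there is nothing to complain about.
function checkPinMode(pin, mode) {
  if (mode !== "PWM" && mode !== "SERVO") {
    return null;
  }

  var peripherals = device.pins[pin] || [];
  var timer = peripherals.filter(function (p) { return /^Timer/.test(p); })[0];

  if (!timer) {
    return pin + " has no timer peripheral, so " + mode + " is unavailable";
  }
  if (timerModes[timer] && timerModes[timer] !== mode) {
    return pin + " shares " + timer + " with a pin already in " + timerModes[timer] + " mode";
  }

  timerModes[timer] = mode;
  return null;
}

With a file that listed A0 and A1 as above, checkPinMode("A1", "SERVO") would pass and a subsequent checkPinMode("A0", "PWM") would return a warning, which is exactly the combination that bit the reduced test case.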

@reconbot (Author)

@zsup would it be reasonable to assume that some timing peripherals don't have this issue? I'm wondering if a mechanism that can return errors or warnings would be the right way to approach this. That way each *-io project can assume responsibility for reporting the limitations regardless of what they are.

Possibly with a new event? Though I suppose error would do the job.

Spark.prototype.pinMode = function(pin, mode) {
  // detect a conflict with the pin modes already in use
  this.emit('warning', string_or_array_or_object_of_warnings);
};
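
(And on the consuming side, a sketch of what listening for that could look like, assuming the proposed 'warning' event existed on the spark-io instance; the token/deviceId setup is the usual spark-io boilerplate, but the event itself is hypothetical.)

var Spark = require("spark-io");

var io = new Spark({
  token: process.env.SPARK_TOKEN,
  deviceId: process.env.SPARK_DEVICE_ID
});

// Surface hardware-constraint warnings instead of failing silently.
io.on("warning", function(warning) {
  console.warn("spark-io warning:", warning);
});

io.on("ready", function() {
  io.pinMode("A1", io.MODES.SERVO);
  io.pinMode("A0", io.MODES.PWM); // would emit a warning: A0 and A1 share Timer 2
});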

@zsup commented Oct 21, 2014

@reconbot every piece of hardware has different constraints, basically depending on how the peripherals are set up and exposed, so it's safe to assume there will be variability in the errors that need to be emitted. That said, I think it's safe to say that every piece of hardware has some constraints (peripherals available on some pins but not others), and it may be worthwhile to put the pattern for handling these constraints in the overall framework (Johnny-Five or Firmata or both).

@rwaldron (Collaborator)

Perhaps the JSON would look something like this:

That example is a reasonable starting place, but we'd still need to design some aspects, e.g. how "conflicts" are defined. "A0": ["Timer 2", "ADC"] doesn't address the immediate issue on its own, but it works as a starting point.

@rwaldron (Collaborator)

Here's a thing... Galileo-IO needs to be able to warn you that Gen 2 boards only have one PWM period shared by all pins (yes... unreal); if you set up a PWM for analogWrite and then a Servo, I believe the last one wins.

@reconbot changed the title from "Servo and PWM doesn't work at the same time with master branch" to "Servo and PWM doesn't work at the same time on the same timing peripheral" on Dec 11, 2014
@Resseguie (Collaborator)

@rwaldron @reconbot was there a consensus on the best way to handle this? Is voodoospark the correct place to track it? Or should this be in either Johnny-Five or the specific IO plugins? Did you ever do something similar for devices like Galileo?

@satishgn @zsup I assume there are similar limitations for Photons now instead of Spark Core?

@zsup commented Jan 24, 2016

Yes, there are similar limitations on the Photon, but they are different because we use a different chip (although it's very similar: an STM32F205 instead of an STM32F103, so a souped-up chip in the same family). But again, every chip has limitations that are specific to how the peripherals are exposed by the chip and by the GPIO API you're using (Arduino/Wiring, mbed, etc.). @technobly or @m-mcgowan can comment on the specifics of the Photon (and upcoming Electron) vs. the Core.
