Dummy OutputPin/InputPin implementations would be useful for handling optional I/O pins. A dummy OutputPin would simply discard any value written to it, while a dummy InputPin would always return a constant value.
Motivation
One very common use for a dummy OutputPin would be chip enable pins, which for some chips are optional. It could also be useful for data/command, read/write etc. pins that the particular circuit doesn't need and instead ties to a constant voltage.
Currently the neatest way to implement an optional I/O pin I can think of is to use a no-op trait implementation.
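Such a no-op implementation could look roughly like the following. This is a self-contained sketch: the OutputPin/InputPin traits are redefined locally in simplified form, since the real embedded-hal traits differ in detail between versions (error types, &self vs &mut self, etc.).

```rust
// Simplified local stand-ins for the embedded-hal digital traits.
use core::convert::Infallible;

trait OutputPin {
    type Error;
    fn set_low(&mut self) -> Result<(), Self::Error>;
    fn set_high(&mut self) -> Result<(), Self::Error>;
}

trait InputPin {
    type Error;
    fn is_high(&self) -> Result<bool, Self::Error>;
    fn is_low(&self) -> Result<bool, Self::Error>;
}

/// Discards any value written to it.
struct DummyOutputPin;

impl OutputPin for DummyOutputPin {
    type Error = Infallible;
    fn set_low(&mut self) -> Result<(), Self::Error> { Ok(()) }
    fn set_high(&mut self) -> Result<(), Self::Error> { Ok(()) }
}

/// Always reads back the constant level it was constructed with.
struct DummyInputPin { high: bool }

impl InputPin for DummyInputPin {
    type Error = Infallible;
    fn is_high(&self) -> Result<bool, Self::Error> { Ok(self.high) }
    fn is_low(&self) -> Result<bool, Self::Error> { Ok(!self.high) }
}

fn main() {
    // A driver can use these exactly like real pins; the calls are no-ops.
    let mut cs = DummyOutputPin;
    assert!(cs.set_low().is_ok());
    let rdy = DummyInputPin { high: true };
    assert!(rdy.is_high().unwrap());
}
```

Because the method bodies are empty (or return a constant), the compiler can trivially inline and eliminate the calls.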
Even though it would be very simple to just code this, I'm opening an issue because I think the following things should be discussed first:
Is this efficient?
This should be pretty efficient for the common use case of chip enable or read/write pins. The calls into empty functions should be easy to optimize out, there's no need for an Option that might increase memory use or cause runtime overhead, and driver implementations don't even have to take the optionality into account: they can just assume the pin is there.
Additionally, when it comes to using an Option, there's a further issue with using it for an optional pin: Option<T> requires some concrete type for T even if you only ever plan on having a None, so you need a dummy OutputPin type anyway.
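The Option<T> problem can be seen in a small sketch (the Driver struct here is hypothetical, purely to illustrate the point):

```rust
/// Hypothetical driver with an optional chip-select pin.
struct Driver<CS> {
    cs: Option<CS>,
}

/// A dummy pin type is still needed, purely to give `None` a type.
struct DummyOutputPin;

fn main() {
    // A bare `Driver { cs: None }` would not compile on its own:
    // the compiler cannot infer T in Option<T> from `None` alone,
    // so a concrete (dummy) type must be named anyway.
    let no_cs: Driver<DummyOutputPin> = Driver { cs: None };
    assert!(no_cs.cs.is_none());
}
```

So even the Option-based design ends up requiring a dummy pin type, on top of its memory and branching costs.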
However, this probably doesn't make it easy for a driver to "detect" whether a pin has been dummied out or not, so if that makes a difference (for instance, if the driver could omit an expensive operation that's only needed to figure out what to write to the pin), that case needs to be handled differently.
Is this safe?
This in theory makes it possible for users to end up with a broken driver instance because they dummied out a pin that is actually required to use the driver. However, this is pretty much equivalent to forgetting a PCB trace, and users should be aware of which pins are really needed for their design to function. A driver could signal optionality by, for instance, having the pin as an optional argument in a builder-style interface.
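A builder-style interface along those lines might look as follows. This is only a sketch; the names (DriverBuilder, with_reset) are illustrative, not from any real driver.

```rust
/// Placeholder pin that discards writes.
struct DummyOutputPin;

/// Hypothetical driver that may or may not have a real reset pin.
struct Driver<RST> {
    reset: RST,
}

struct DriverBuilder<RST> {
    reset: RST,
}

impl DriverBuilder<DummyOutputPin> {
    fn new() -> Self {
        // The reset pin defaults to a dummy; only boards that wired
        // one up need to replace it.
        DriverBuilder { reset: DummyOutputPin }
    }
}

impl<RST> DriverBuilder<RST> {
    /// Swap in a real pin; the builder's type parameter changes with it.
    fn with_reset<P>(self, pin: P) -> DriverBuilder<P> {
        DriverBuilder { reset: pin }
    }

    fn build(self) -> Driver<RST> {
        Driver { reset: self.reset }
    }
}

fn main() {
    // Board without a reset pin: the dummy is used transparently.
    let _driver = DriverBuilder::new().build();
    // Board with a real pin (u8 stands in for a pin type here).
    let _driver2 = DriverBuilder::new().with_reset(42u8).build();
}
```

The builder makes the optionality explicit in the API while the driver itself still assumes a pin is always present.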
Does this belong in embedded-hal?
All drivers could also have their own struct DummyOutputPin, but I think it would make sense to have the dummy implementation directly in some crate. This would probably be the first real "trait implementation" in this crate, so I'm not sure if it's out of scope. Having an entirely new crate just for a couple of dummy implementations sounds like a hassle though.
Thanks @eldruin for providing this crate. Are there any plans to make this built into embedded-hal? This seems like a very useful, commonly needed feature.
I'm writing a driver for an SPI ADC, and I actually have to add chip selects even though I don't need them for my test board (chip select is managed by the peripheral).