By johnwillyn
I have an electrical conundrum that I can't figure out, explain, or otherwise work around.

I have a Wemos D1 mini wired up to a DS18B20 temperature sensor in accordance with the various tutorials found all over the web.

Specifically, I have a 4.7k Ohm pullup resistor between the power pin and the data pin on the sensor, and I am using the OneWire interface on the Arduino.
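
Roughly, the code looks like the standard example below (I'm paraphrasing from memory, so the pin assignment, library details, and the 2-second delay are illustrative rather than exact):

#include <OneWire.h>
#include <DallasTemperature.h>

// DS18B20 data line on D4 (GPIO2) of the Wemos D1 mini,
// with the external pullup from the data line to 3.3V as described above.
const int ONE_WIRE_PIN = D4;

OneWire oneWire(ONE_WIRE_PIN);
DallasTemperature sensors(&oneWire);

void setup() {
  Serial.begin(115200);
  sensors.begin();
}

void loop() {
  sensors.requestTemperatures();             // start a conversion on the bus
  float tempC = sensors.getTempCByIndex(0);  // read the first (and only) sensor
  Serial.println(tempC);
  delay(2000);
}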

I need to "extend" the cable on the sensor to about 25' (8m or so), which according to most posts should not be a problem. Based on what I have read, I chose to use Cat 6 cable to reduce interference and because I can get it in "direct bury" version for outdoor use. I am planning to use this as a pool temperature sensor.

But I also want to power this from a 12 V DC power supply, which will also be used to turn a motorized valve on and off (at some point), and it seemed reasonable to use a single power supply for all of the devices.

Here is the power source conundrum that I am hoping one of the electrical gurus can help explain.

I have the prototype working with the 25' of Cat 6 attached, and when the Wemos is plugged into any of my USB power adapters (powered directly from house line voltage), everything runs fine and the sensor readings are what you would expect (25 to 26 °C). I have tried several different desktop USB sources (chargers, power strips, etc.), and they all seem fine (stable readings).

When I switch over to the 12 V DC power supply and use it to drive a 5 V USB regulator (such as one of the "Buck Converter with USB ports" modules from Amazon), my sensor starts to flake out and I get (a lot of) readings like -127 °C.
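
For what it's worth, the -127 lines up with DEVICE_DISCONNECTED_C, which is the value the DallasTemperature library returns when the sensor doesn't answer or the reply fails its CRC check. In my loop I'm catching it with something like this (again, a rough sketch rather than my exact code):

void loop() {
  sensors.requestTemperatures();
  float tempC = sensors.getTempCByIndex(0);
  if (tempC == DEVICE_DISCONNECTED_C) {
    // -127: the sensor did not respond, or the reply failed its CRC check
    Serial.println("Bad read (-127)");
  } else {
    Serial.println(tempC);
  }
  delay(2000);
}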

The 12 V power supply seems to deliver a consistent 11.9 V DC, and the buck converters (I have several) all seem to deliver a steady 5 V DC.

Some of the suggestions I found on various forums said that I might need a lower value for the "pullup" resistor. I have been experimenting with this, starting at 2.2k, then down to 1k, then down to 660 Ohms (two 330 Ohm resistors in series), and each step to a lower value seems to help stabilize the readings I am seeing.

I have it running with a single 330 Ohm resistor right now, and that is the most stable the readings have been with the 12 V power supply and buck converter as the USB power source.

What do you think might be causing this? Should I just run with the 330 Ohm pullup resistor in this configuration? Will it harm anything in the long term?

Is this a timing thing (the sensor does not have enough time to pull the data line low)?

Do you think it might be "noise" introduced by the power supply?

Many of the basic power circuit concepts are new to me, and I would like to learn from this experience.

Thanks in advance for any help or suggestions.

JohnL