problem with time base settings over 200ms/div

Ask any questions relating to the PicoScope hardware or kit contents here.
If you have any questions prior to purchasing the kit, post them here.
Paul Danner

problem with time base settings over 200ms/div

Post by Paul Danner » Sat Jun 18, 2005 12:13 am

I am using the automotive Pico kit (ADC-212). When viewing signals with a time base of 200ms/div or more, my waveforms become distorted. Today I was looking at a square wave signal on a single trace that was no more than 4 or 5 Hz. On a 100ms/div time base (1 second total screen) the signal looks perfect. If I increase the time base to 200ms/div, the square wave starts to look like triangles with lots of distortion. I am somewhat familiar with record points and scope limitations, but I could use some help.
The first time I had this problem was performing a relative compression test, with the time base and current levels set up automatically by the scope. Only after dropping the time per div (the automatic setting is 200ms/div) was I able to see clean starter current "humps".
This has been a problem for me since day 1 and I have just now recognized the pattern for the problem.
Thanks in advance!!!

Posts: 357
Joined: Wed Nov 20, 2002 4:19 pm
Location: Washington State USA


Post by Autonerdz » Sat Jun 18, 2005 5:45 pm

Hi Paul,

Use Block Mode.

At time bases greater than 100ms/div, the internal ADC buffer is not being utilized in Standard or Chart Recorder mode. With those settings, the ADC streams to the PC at a very slow rate, no faster than 1 kS/s. This means you don't have an adequate sample rate, and it does not matter how high the number of samples is turned up.

Using Block Mode will store the samples in the ADC buffer independently at the full sample rate, then transfer the data to the PC in blocks. This will maintain robust sampling at the longer time bases.
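A rough numerical sketch of why too few samples per cycle turns a square wave into triangles. This assumes nothing about the Pico internals — the sample rates and the 5 Hz signal are illustrative only, and the "triangle" look comes from the scope drawing straight lines between widely spaced sample points:

```python
import numpy as np

def square_wave(t, freq=5.0):
    """Ideal +/-1 square wave at `freq` Hz (illustrative signal only)."""
    return np.where(np.sin(2 * np.pi * freq * t) >= 0, 1.0, -1.0)

duration = 2.0  # seconds on screen, e.g. 200 ms/div x 10 divisions

# Fast sampling (block-mode-like): ~200 samples per 5 Hz cycle,
# so the flat tops and sharp edges are well resolved.
t_fast = np.arange(0, duration, 1 / 1000.0)   # ~1 kS/s
fast = square_wave(t_fast)

# Slow effective sampling (streaming-like): only ~4 samples per cycle.
# A small offset keeps samples off the exact edge instants.
t_slow = np.arange(0, duration, 1 / 20.0) + 0.01  # ~20 S/s
slow = square_wave(t_slow)

# With so few points per cycle, adjacent samples frequently land on
# opposite levels, and linear interpolation between them draws ramps
# (triangles) instead of flat tops.
transitions = int(np.sum(np.abs(np.diff(slow)) > 0))
print(f"fast samples per cycle: {len(t_fast) / (duration * 5):.0f}")
print(f"slow samples per cycle: {len(t_slow) / (duration * 5):.0f}")
print(f"level changes between adjacent slow samples: {transitions}")
```

The point of the sketch: the signal itself is unchanged; only the number of samples per cycle drops, and that alone is enough to make a square wave display as triangles.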
Tom Roberts
(The Picotologist)
