Needing more than a spark test?

Now things are getting very interesting!
 
Is the "straight line" part simply that the ring buffer length ran out shortly after the peak? No matter, so long as we get the peak! :)
I don't think so, the circular buffer is 512 values deep. At 360 kSPS that's 1422 microseconds. But that falling edge doesn't look very much like exponential decay, does it? Since I want to use the pulse area to get the best possible noise rejection, the pulse shape DOES need to conform to reality.

I'll have to look more closely at the code to see if there's something wrong with it. It wouldn't be the first time!
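For the record, the area computation I'm after is nothing fancy -- just a baseline-subtracted sum over the qualified samples. A minimal sketch, with placeholder names for whatever the real capture code ends up providing:

// Hypothetical sketch: integrate a pulse by summing baseline-subtracted
// samples between the start and end indices found by the qualifier.
long pulse_area(const uint16_t *pulse_buffer, int start, int end, uint16_t baseline) {
  long area = 0;
  for (int i = start; i <= end; i++) {
    area += (long)pulse_buffer[i] - baseline;  // signed, so baseline noise can cancel
  }
  return area;
}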

Edit: changed "samples" to "microseconds" above. Clearly a buffer that's 512 entries long can't be 1422 samples deep at the same time!
 
But that falling edge doesn't look very much like exponential decay, does it?
No it doesn't. Something isn't right. It's like some signal is missing, or some samples got rejected.
 
Some steps forward, some back. I "improved" a number of things and the plotted pulses subsequently looked like hell. One issue I found was that I was double-incrementing the index into my pulse buffer -- the index is the "i" variable in a for(i=0; i < something; i++) {} loop, and then I did something dumb inside the loop like pulse_buffer[i++] = some_value. The result of skipping every other entry was intermixing old and new data. Not a good way to preserve the shape of a pulse...
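In miniature, the bug looked like this (read_sample() standing in for the real ADC fetch):

// WRONG: 'i' gets bumped twice per pass, so every other slot keeps stale data
for (i = 0; i < count; i++) {
  pulse_buffer[i++] = read_sample();
}

// Fixed: let the for() statement do the only increment
for (i = 0; i < count; i++) {
  pulse_buffer[i] = read_sample();
}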

The other problem, possibly causing that falling-edge issue, is that I have been incorrectly using the trigger-level value to establish the limits of the pulse. Clearly, a pulse that significantly exceeds the trigger level has a "life" before and after that. Well, more bug-smashing is in the works.
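The fix I have in mind is to walk outward from the trigger crossing until the signal falls back to (near) the baseline, so the integration covers the whole pulse rather than just the part above the trigger. A rough sketch -- the names and the noise margin are made up:

// Hypothetical sketch: find the true extent of a pulse around the
// trigger-crossing index 'trig_idx' in a captured buffer.
void find_pulse_limits(const uint16_t *buf, int len, int trig_idx,
                       uint16_t baseline, uint16_t margin,
                       int *start, int *end) {
  int s = trig_idx;
  while (s > 0 && buf[s - 1] > baseline + margin) s--;       // walk back to the leading edge
  int e = trig_idx;
  while (e < len - 1 && buf[e + 1] > baseline + margin) e++; // walk forward through the tail
  *start = s;
  *end = e;
}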

Getting late tonight. I'll attack it some more tomorrow.
 
This one really drove me up the wall. I tried all kinds of different approaches and nothing produced any decent-looking pulses, even though my pulse qualification code was saying things were looking good. I started getting suspicious about the serial plotter tool in the Arduino IDE and it turns out that there were two problems happening. The first is that I was trying to dump lots of data in a tight loop into a fairly small serial buffer. It naturally overflowed. The second is that all evidence suggests the serial plotting code isn't particularly fast, so trying to plot data at something faster than 9600 baud causes it to misbehave.
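The workaround is simply to pace the output to what the TX buffer and the plotter can absorb, roughly like this (the pacing delay is a guess to be tuned):

// Hypothetical sketch: throttle serial output so neither the TX buffer
// nor the plotter gets flooded.
void plot_pulse(const uint16_t *buf, int count) {
  for (int i = 0; i < count; i++) {
    while (Serial.availableForWrite() < 8) {
      // wait for room in the TX buffer instead of overflowing it
    }
    Serial.println(buf[i]);
    delay(2);  // crude pacing so the plotter itself can keep up
  }
}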

After addressing those issues I'm seeing much more reasonable-looking pulses. Not all the time so there may still be something going on with either my code or that pesky serial monitor. I'm inclined to dump it for something better.

Anyway, here's a reasonable looking pulse captured by my current S/W:

[Attached image: XRF pulse2.JPG]
I can't plot the entire pulse due to limitations with the serial plotter. If you look at the x axis you'll see it's showing just over 40 samples. Not sure what's going on there -- documentation for the serial plotter is just about nonexistent, so I don't know if it's possible to change the horizontal resolution or not. The plot window completely fills the screen, so I can't get any better than that. At this point, anyway.
 
Don't know if this is useful or not, but it seems to be a Python-based plotter for this sort of use. It should run on Linux, Macs & Windows. I have had to do something similar because the built-in serial plotter is very limited and not well documented. I seem to remember a buffer length of 500 or something weird like that. I had 1K FFTs I wanted to look at!
 
I found that one too. I may take a closer look at it, but I also just bought a little standalone TFT display from Adafruit that has decent resolution, a reasonable price and only needs an SPI interface. And there's a library available for it.
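If it pans out, the plotting side could be as simple as something like this -- a sketch assuming an ILI9341-based Adafruit display and made-up pin assignments (the actual part may differ):

#include <Adafruit_GFX.h>
#include <Adafruit_ILI9341.h>

#define TFT_CS 10   // placeholder pin assignments
#define TFT_DC  9
Adafruit_ILI9341 tft(TFT_CS, TFT_DC);

void setup() {
  tft.begin();
  tft.fillScreen(ILI9341_BLACK);
}

// Draw one captured pulse as a connected line, scaling 12-bit samples
// into the display height with the origin at the bottom.
void draw_pulse(const uint16_t *buf, int count) {
  tft.fillScreen(ILI9341_BLACK);
  for (int x = 1; x < count && x < tft.width(); x++) {
    int y0 = tft.height() - 1 - ((long)buf[x - 1] * tft.height()) / 4096;
    int y1 = tft.height() - 1 - ((long)buf[x] * tft.height()) / 4096;
    tft.drawLine(x - 1, y0, x, y1, ILI9341_GREEN);
  }
}

void loop() {
  // call draw_pulse(...) here whenever a fresh pulse arrives
}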

One way or the other, I will get to the root of this current problem... if it's real. I'm not convinced it is, but it will be nice to have another way to troubleshoot complex data acquisition/processing problems.
 
While Python code can be persuaded to play nice, it takes some skill to make it so, especially when it comes to loops nested inside other loops with conditionals. Avoid it in favour of Java or C++ if possible. I do not say this as an expert in programming because, although I have at some stage meddled with most computing languages and platforms, starting with 6502 hex code, through FORTRAN, Pascal, C, C++, etc., and ending up with Python and Java, that is my weak area. I tended to suck at all of them!

This was advice from my son, who definitely is a programming professional, with a maths Masters from Exeter. He wrote an entire satellite data management and tracking control suite that runs 24/7 in dozens of places on the planet. He points out the difficulties with Python, and why it takes deep experience to keep Python code from running slowly and with variable loop execution time.

On the Raspberry Pi, various Python tools and development environments are useful as a fast way to get a final application up and to publish code other experimenters can just load and use, and Python can, if done right, be very powerful, but I have had trouble making it so. I have an old A/D converter Pi "hat" add-on that was supposed to control a signal generator chip to make a sweep signal analyzer. The Python code ran so slowly it was simply dysfunctional. Admittedly, the originator should not have used loops within loops the way he did, but it took my son only minutes to get (Java) code up that was running at several MHz!

Python is handy as a controller that relies on calling up much faster, better code to handle the critical areas. PyMCA will be OK, because it analyzes and presents the plots as a separate post-processing action, working on files already assembled by the data-gathering code. It is not part of anything time-critical, and does not snatch execution time from real-time sampling processes in a way that makes the high-priority loop execution time variable.

When we attempt to snatch data on a continuous basis, it is important that the timing loop be constant, with a portion that departs to execute little pieces of slow stuff like a plot display someone is looking at, or mouse input someone is doing. We won't notice it, because the loop completes in something less than 5 ms. This is not something that interpreted languages like Python or BASIC are good for. That is not to say don't use them. They are great for fast, easy development of the user interface. Just don't let them get in the way of timing-critical data logging code.

The approach of letting the sampling and stashing functions interrupt the display plotting code should work just fine.
What Mark is attempting is extremely challenging, and I am in awe of it!
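On a Teensy, that arrangement might look something like this minimal sketch, using its IntervalTimer -- the pin, rate, and buffer size are all placeholders:

const int ADC_PIN = A0;          // placeholder input pin
const int BUF_LEN = 512;
volatile uint16_t ring[BUF_LEN];
volatile int head = 0;
IntervalTimer sampler;

void sample_isr() {
  ring[head] = analogRead(ADC_PIN);   // fixed-rate capture, immune to the slow stuff below
  head = (head + 1) % BUF_LEN;
}

void setup() {
  analogReadResolution(12);
  sampler.begin(sample_isr, 10);      // fire every 10 us = 100 kSPS, for example
}

void loop() {
  // All the slow, variable-time stuff (plotting, UI) lives here and
  // simply gets interrupted whenever the sampler fires.
}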

Why is Python so Slow?


Why People Use Python Even If It’s Slow
 
I think it's a bit of misinformation to state that Python is always slow. If you use the numpy and scipy libraries it isn't -- it's at least as fast as Matlab. Both of those libraries are vectorized and have been optimized in compiled code. So, like Matlab, you can do element-by-element multiplication or matrix math of any sort, including SVD (Singular Value Decomposition) and the like.

I ran full-up radar simulations with loads of physics models in Python/numpy/scipy in my job: dozens of moving targets, with statistics calculated on the quality of the results. Using numba you can even run computations on GPUs. What's in Python's favor is its ease of development; I'm at least 5-10x more productive using Python than C. Of course, it's not for real-time processing. I've often prototyped algorithms in Python to validate that they work and then ported them to lower-level languages. I prototyped a client/server system that way in a single day to get it working, having never done any network programming before (both the client side and the server, on two different computers). Then I ported it to C++ the second day, without knowing C++. The object-oriented design of both languages was so similar that even my dense self could pull it off. I had to use C++ on the server so I could take advantage of OpenMP. It was running 52 cores in parallel, 48 of which were running FFTW. This was before the days of GPUs.

You can get pretty far with Python before throwing in the towel -- if you use the optimized libraries, that is. At least, that's my experience.

All that being said, I've only used C and C++ on a Teensy. It's pretty darn capable for its size and power. I'm using one in my own ELS. I wrote the code from first principles and it's seriously one of the nicest additions to my lathe. I hated changing gears on my lathe since little pieces would often go flying to parts unknown, and handling the gears was always messy. Now that mess and inconvenience is in the past. I highly recommend going with an ELS system of any sort.
 