Code Programming

Just for fun

Tim Young
H-M Lifetime Diamond Member
Joined
Oct 7, 2020
Messages
2,113
How does one learn to program code like what's used in the Clough42 ELS system? When I look through the program in Code Composer, a lot of what I see is like reading Greek; OK, not really that bad, but still. I'm not even sure what the language is called.
 
I haven't looked at the Clough42 code, but the fact that you're using Code Composer leads me to believe that it's C or C++. With that, you can hunt down a course or book on C/C++ programming. (I recommend "Effective C" by Seacord.) BUT it's a deep topic; be committed, or don't bother. If your only use is an ELS, it's probably better to just copy what others have done.

I've actually been working on creating an embedded C course, aiming to release around April (but maybe later); it's taken a lot longer than I thought it would.

GsT
 
The Clough ELS uses C/C++.

If you want to start learning C/C++ on microcontrollers, it's tough to find something easier than the Arduino ecosystem. It's very well documented, in both the hardware and software. Knowing some basic C syntax might be helpful, but it's not required. There are some online courses that cover it in detail. There are options for BASIC, Python, Java, and web-based technologies as well. I think I would suggest C to start with, just because it's so common. If you decide to get some hardware, look at the Arduino Pro Micro: tiny board, USB connection, very easy to program for, and cheap.
 
If you have never programmed before, I would advise against your first language being C/C++.

Maybe if you have one and only one use case, such as fiddling with the code that runs Clough42's ELS, you might be able to limit yourself to learning just what you need to know, but that's going to make understanding what mistakes you've made in your coding and how to fix them (and you will make mistakes) all the harder.

There are a fair few concepts to get your head around in programming in general, which are possibly better learned in a language designed for ease of use and learning. As much as I subjectively don't like its style, Python is many people's first language these days for good reasons (another option being JavaScript, but that way madness lies!).

Sure, you'll probably learn a bunch of stuff you think you'll never use, but the learning and practice will get you thinking somewhat like a programmer, and if you want to be able to write good, bug-free, maintainable code (and there's no such thing as code that never needs to be changed), that kind of mindset is important; doubly so in a language like C++, where it is possible to write code that, merely days later, is unintelligible!

Another good thing about using something like Python to learn in is if after a decent amount of effort you find yourself still struggling to understand the basic concepts you can abandon the attempt, safe in the knowledge that coding is not for you. ;)
 
At least you didn't suggest learning on lisp or perl. :)

Nothing wrong with starting on a high level like python. I started with C, but I am weird.

Feel free to post if you get stuck. We have a few coders here that can help. More than I expected on a machining forum.
 
At least you didn't suggest learning on lisp or perl. :)

Nothing wrong with starting on a high level like python. I started with C, but I am weird.

Feel free to post if you get stuck. We have a few coders here that can help. More than I expected on a machining forum.
Common Lisp is my very favorite language... But, it has no place in embedded programming.

GsT
 
If you have never programmed before, I would advise against your first language being C/C++.
For this particular use case, wouldn’t it actually make sense to learn C/C++ as the first language?

I’d encourage @Just for fun to go for it! It can be done and has been done. The O’Reilly learning platform’s 10-day free trial is a good starting point :encourage:
 
Took me a while to grasp C. The classic book was Kernighan & Ritchie's "The C Programming Language". For a decade the language sort of tortured me. I would drag out K&R to get me through a pickle, but it didn't come easy. C++ is like an extension of C, with some really helpful features. Curiously, I learned a whole lot about C++ only after learning Python. Python is a higher-level language that uses many of the same object-oriented concepts as C++. Python is a much, much easier language to start out in, because there's practically instant feedback if you mess up. Python is an interpreted language, rather than compiled, so you don't need to go through the compile-and-link exercise. It does have compiled, optimized modules/libraries which have very fast execution times. I used these libraries at work to create radar simulations of substantial complexity that had as good or better performance than MATLAB, with zero cost.

The downside of Python is that it is not very good for real-time machine control, as it's tough to get the same responsiveness as a program in C or C++. Also, Python is not as available as the other languages for microcontrollers.

Getting back to C++: I learned object-oriented design concepts in Python. Because of this, the jump to C++ was not severe. About 15 years ago now, I needed to create a high-performance machine to calculate radar response to interference. The machine was remotely located in the facility because of its cooling requirements; the fans were incredibly loud. I had to write a client-server application that interfaced with the user. I knew nothing about client-server design, and even less about network programming. After reading about how network transactions work for 3 days, I wrote a script in Python for both the client and server in one day. This is not because I was smart, or particularly talented, but because Python has incredible resources and libraries. The language's claim is "batteries included". I was able to test it at my desk in a day, to work out the network transactions. To run on the remote server, I had to port the code to C++ because that platform did not have Python. I had never written a line of C++ code in my life. Using a colleague's book on C++, and my object-oriented Python code, I ported to C++ in one day. Because the algorithm had been fully tested in Python, and did what it needed to do, it ran flawlessly in C++. Debug time: less than a day. This is the power of prototyping in a higher-level language. As an aside, the server computed 32-million-point double-precision (complex) FFTs in milliseconds.

C++, surprisingly, was similar enough in many respects to Python. I would never have been able to write C++ from scratch at that time. Nowadays it still doesn't come easy at all, although it is used quite often with Arduino-like platforms. You can get by with C for quite a while before having to migrate to C++. But to be fair, a lot of the libraries in Arduino are written in a C++ fashion.

I did write my own ELS code from scratch. I used a Teensy 4.1, with a 600 MHz Arm Cortex-M7 core, which is still a relative bargain at about $30. The Teensy platform can be programmed in Arduino, or other similar platforms. It has an extensive set of libraries which are optimized for its hardware. Of interest for an ELS are the Teensy's onboard hardware quadrature encoders. In my case, I abandoned them for a specialized software encoder that provided more flexibility for my use. There is sufficient "horsepower" under the hood that the application works flawlessly over the entire operational envelope of the lathe. The Teensy is also driving a touch-panel display, which serves as an intuitive user interface. There are no switches or hard buttons.

One can probably use a lower-capability processor and do this. I simply didn't want to put in the effort of creating my ELS and risk being limited by the platform. In my professional career, I was scarred by being forced to work with underpowered processors by management that had no idea what the task truly required. They wanted to be heroes by beating on engineering to do the impossible. This resulted in a year's schedule slip and lost market share, because the task truly was more complex than they could comprehend. A lot of math was required, and the processors simply could not execute the algorithms fast enough, nor was there sufficient memory to hold better algorithms. The management was replaced, a suitable processor was chosen, and the task was completed in 4 months.

If you are effectively making a prototype system, why hobble your efforts for the sake of $20? Your development time, labor, and other material costs will swamp the relative cost difference. It is a false economy for a one-off, or for hobbyist use.

If you know of others that are pioneering the capabilities of lower-cost yet capable platforms and want to follow along, great. The one I know of is working with a platform that requires a deeper knowledge of the silicon's capabilities. Perhaps he will be successful. The original poster in that thread seems to have abandoned their effort using that platform. I am speaking specifically about the RPi Pico thread. I wish the most recent poster the greatest of success.

For me, it simply was a risk analysis. What is the least-cost method to get to the end, considering availability of parts, ease of implementation, the feature set I desired, and my current (and anticipated future) skill set? In my case, the Clough42 TI processor boards were unavailable at any cost at the time (pandemic), I disliked the primitive user interface, and I found James's software to be somewhat sparse in documentation. So I did my own. Learned a lot along the way.

Suggest you learn some C, and maybe Python. Python, perhaps, only for testing out algorithms or concepts, and simply acquiring a new way to rapidly test ideas. From a practical sense, I have found that Python is my go-to for solving interesting problems that require computation, since it is so easy for me to use. My programming productivity is far superior in Python than in C, by a factor of at least 10X. C (and C++) seem to be the languages of microcontrollers for real-time control. If you need any help along your journey, let me know, and I would be glad to help. I have learned a lot from others, so I'd like to pass some of that along, to the best of my ability.
 
There's a lot of danger in learning 'some C'. It's a fantastic language, and the lingua franca of the embedded world (though C++ is making more inroads all the time), but it has possibly more ways to shoot yourself in the foot than any language I've used - and I've used more than a few. As @WobblyHand noted, it's not the language for extreme productivity; it's the language of extreme economy in time and memory. Only assembly could be more sparse. But you really need to know the device you're programming, understand the different types of memory (storage duration) that you're using, and grok Undefined Behavior. A little C is a dangerous thing. You can put something together that appears to work - maybe today, or every other run, or for the next month or year before it suddenly stops working - and without a deep understanding (sometimes even *with* a deep understanding...) the reason can be incredibly hard to suss out. Don't get me wrong, I think learning C is great, but only if you commit to really learning C - a process that will take quite some time.

GsT
 