Took me a while to grasp C. The classic book was Kernighan & Ritchie's "The C Programming Language". For a decade the language sort of tortured me. I would drag out K&R to get me through a pickle, but it didn't come easy. C++ is essentially an extension of C, with some really helpful features. Curiously, I learned a whole lot about C++ only after learning Python. Python is a higher-level language that uses many of the same object-oriented concepts as C++. Python is a much, much easier language to start out in, because there's practically instant feedback if you mess up. Python is an interpreted language rather than a compiled one, so you don't need to go through the compile-and-link exercise. It does have compiled, optimized modules/libraries with very fast execution times. I used these libraries at work to create radar simulations of substantial complexity that had as good or better performance than MATLAB, at zero cost. The downside of Python is that it is not very good for real-time machine control, as it's tough to get the same responsiveness as a program in C or C++. Also, Python is not as widely available on microcontrollers as the other languages.
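To give a flavor of what those compiled libraries buy you, here is a minimal sketch (assuming NumPy as the compiled library; SciPy and friends work the same way). A pure-Python loop executes one interpreted operation at a time, while the equivalent NumPy call drops into compiled code:

```python
import numpy as np

# Pure-Python dot product: one interpreted multiply-add per element.
def dot_loop(x, y):
    total = 0.0
    for xi, yi in zip(x, y):
        total += xi * yi
    return total

x = np.arange(1_000_000, dtype=np.float64)
y = np.ones_like(x)

# Same math via NumPy's compiled routine -- typically orders of
# magnitude faster, because the loop runs in C rather than in the
# Python interpreter.
fast = np.dot(x, y)
slow = dot_loop(x, y)
assert np.isclose(fast, slow)
```

Wrap each call in `timeit` on your own machine to see the gap for yourself.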
Getting back to C++. I learned object-oriented design concepts in Python. Because of this, the jump to C++ was not severe. About 15 years ago now, I needed to create a high-performance machine to calculate radar response to interference. The machine was remotely located in the facility because of its cooling requirements; the fans were incredibly loud. I had to write a client-server application that interfaced with the user. I knew nothing about client-server architecture, and even less about network programming. After reading about how network transactions work for three days, I wrote a script in Python for both the client and the server in one day. This is not because I was smart, or particularly talented, but because Python has incredible resources and libraries. The language's motto is "batteries included". I was able to test it at my desk in a day, to work out the network transactions. To run on the remote server, I had to port the code to C++ because that platform did not have Python. I had never written a line of C++ code in my life. Using a colleague's book on C++, and my object-oriented Python code, I ported to C++ in one day. Because the algorithm had been fully tested in Python, and did what it needed to do, it ran flawlessly in C++. Debug time: less than a day. This is the power of prototyping in a higher-level language. As an aside, the server computed 32-million-point double-precision (complex) FFTs in milliseconds.
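The client and server I wrote aren't reproduced here, but the pattern that Python's standard `socket` library makes so quick to stand up can be sketched as a minimal one-shot request/response pair (the names and the `ACK:` framing are illustrative, not from the original code):

```python
import socket
import threading

def run_server(host="127.0.0.1", port=0):
    """Start a one-shot echo server; return the port it bound to."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, port))           # port 0 = let the OS pick one
    srv.listen(1)
    actual_port = srv.getsockname()[1]

    def serve():
        conn, _ = srv.accept()       # block until a client connects
        with conn:
            data = conn.recv(4096)   # read one request
            conn.sendall(b"ACK:" + data)  # reply and close
        srv.close()

    threading.Thread(target=serve, daemon=True).start()
    return actual_port

def send_request(port, payload, host="127.0.0.1"):
    """Client side: connect, send one payload, return the reply."""
    with socket.create_connection((host, port)) as sock:
        sock.sendall(payload)
        return sock.recv(4096)
```

A real application needs message framing and error handling on top of this, but the skeleton really is this small, which is why the prototype only took a day.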
C++, surprisingly, was similar enough to Python in many respects. I would never have been able to write C++ from scratch at that time. Nowadays it still doesn't come easy at all, although it is used quite often with Arduino-like platforms. You can get by with C for quite a while before having to migrate to C++. But to be fair, a lot of the libraries in the Arduino ecosystem are written in a C++ fashion.
I did write my own ELS code from scratch. Used a Teensy 4.1, with a 600 MHz Arm Cortex-M7 core, which is still a relative bargain at about $30. The Teensy platform can be programmed in Arduino or other similar environments. It has an extensive set of libraries which are optimized for its hardware. Of interest for an ELS are the Teensy's onboard hardware quadrature encoders. In my case, I abandoned them for a specialized software encoder that provided more flexibility for my use. There is sufficient "horsepower" under the hood that the application works flawlessly over the entire operational envelope of the lathe. The Teensy also drives a touch-panel display that serves as an intuitive user interface; there are no switches or hard buttons.
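My software encoder itself isn't shown here, but the core technique behind any software quadrature decoder — sampling the A/B pins and accumulating counts from a state-transition table — can be sketched in Python (a generic illustration only; the actual ELS code is C++ on the Teensy):

```python
class SoftQuadDecoder:
    """Quadrature decoder driven by sampled A and B pin states.

    Index into DELTA is (previous_state << 2) | current_state, where a
    state packs the two pins as (A << 1) | B.  Valid Gray-code moves
    contribute +1 or -1; repeats and illegal double-jumps contribute 0.
    """
    DELTA = [0, +1, -1,  0,
             -1, 0,  0, +1,
             +1, 0,  0, -1,
              0, -1, +1, 0]

    def __init__(self):
        self.prev = 0   # last sampled (A << 1) | B state
        self.count = 0  # accumulated position in quadrature counts

    def sample(self, a, b):
        state = (a << 1) | b
        self.count += self.DELTA[(self.prev << 2) | state]
        self.prev = state
        return self.count
```

On a microcontroller the same table typically lives in a pin-change interrupt handler; doing the decode in software rather than in the hardware peripheral is what buys the extra flexibility mentioned above.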
One can probably use a lower-capability processor and do this. I simply didn't want to put in the effort of creating my ELS and risk being limited by the platform. In my professional career, I was scarred by being forced to work with underpowered processors by management that had no idea what the task truly required. They wanted to be heroes by beating on engineering to do the impossible. This resulted in a one-year schedule slip and lost market share, because the task truly was more complex than they could comprehend. A lot of math was required, and the processors simply could not execute the algorithms fast enough, nor was there sufficient memory to hold better algorithms. The management was replaced, a suitable processor was chosen, and the task was completed in four months.
If you are effectively making a prototype system, why hobble your efforts for the sake of $20? Your development time, labor, and other material costs will swamp the relative cost difference. It is a false economy for a one-off, or for hobbyist use.
If you know of others who are pioneering the capabilities of lower-cost yet capable platforms and want to follow along, great. The one I know of is working with a platform that requires a deeper knowledge of the silicon's capabilities. Perhaps he will be successful. The original poster in that thread seems to have abandoned their effort using that platform. I am speaking specifically about the RPi Pico thread. I wish the most recent poster the greatest of success.
For me, it simply was a risk analysis. What is the least-cost method to get to the end, considering availability of parts, ease of implementation, the feature set I desired, and my current (and anticipated future) skill set? In my case, the Clough42 TI processor boards were unavailable at any cost at the time (pandemic), I disliked the primitive user interface, and I found James's software to be somewhat sparse in documentation. So I did my own. Learned a lot along the way.
Suggest you learn some C, and maybe Python. Python, perhaps only for testing out algorithms or concepts, and simply for acquiring a new way to rapidly test ideas. From a practical sense, I have found that Python is my go-to for solving interesting problems that require computation, since it is so easy for me to use. My programming productivity is at least 10x higher in Python than in C. C (and C++) seem to be the languages of microcontrollers for real-time control. If you need any help along your journey, let me know, and I would be glad to help. I have learned a lot from others, so I'd like to pass some of that along, to the best of my ability.