
Bass pedalboard?

Started by R.G., October 27, 2013, 03:58:11 PM


Resynthesis

Quote from: R.G. on October 29, 2013, 08:42:11 PM

Once you get over the hump of writing well documented, well thought out code, it's about as easy as hardware. CMOS logic is easy - until one starts playing tricks with it and using side effects, especially undocumented ones, or depending on the quirks of one manufacturer's products. Then it gets dirty too.

Back when I first took FORTRAN, about 1970, I thought the program statements were the program. It took a few decades to come around to the view that the program exists as a set of operations free of any programming language, very nearly the stuff of thinking itself, and that written-down programs are just more-or-less accurate translations of that set of ideas into one or another language. Accordingly, if the thinking is clear, the translation is easy and straightforward. If the thinking is muddy, no language will help.

Have you ever used formal methods such as Z, B or CSP? If not, the idea is that the system is specified abstractly using formal logic, so you can reason about and prove properties of your formalism. In some domains this leads to direct generation of source code which should be provably correct (provided, of course, that your compiler and hardware are provably correct). The more generally applicable halfway house is to use a tool such as SPARK Ada, which lets you instrument your code to ensure your formally defined conditions are met by the implementation. When I was a lad SPARK cost an arm and a leg, but there's a GPL version now.

Anyway, back to our normal programming (yes, pun intended).

R.G.

Quote from: amptramp on November 04, 2013, 08:15:32 PM
Please tell us what your approach is to the top octave - I love hearing about the various methods of synthesizing a top octave.  Unless, of course, you are doing something proprietary.
Proprietary? Hardly. You pick a clock frequency high enough that some integer number of clock ticks closely approximates the (generally non-integer) period (or half-period) of the desired frequency. Each time that count elapses, you increment the note register by one. The LSB then toggles at the top-octave frequency, or one octave down if you had to count full periods. Successively more significant bits give successively lower octaves, and you get them all at once.

The trick is keeping the pitch errors caused by using a finite number of clock ticks per cycle or half-cycle well below what the human ear can detect as being off-pitch.

The math to do that is not complicated, just repetitive, and is most simply done by seeing what clock frequencies (crystals) are available to you and running the spreadsheet calcs for each one.
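
If you want to see the shape of that calculation, here's a quick sketch in C; the 20 MHz clock and the C8-based top octave are just illustrative picks, not the values in my design. The usual rule of thumb is that errors beyond a few cents are where ears start to notice.

/* A minimal sketch of the "spreadsheet" step: for an assumed 20 MHz clock,
   count how many whole clock ticks fit in each top-octave half-period and
   see how far the rounded count lands from the ideal pitch, in cents. */
#include <stdio.h>
#include <math.h>

int main(void)
{
    const double f_clock = 20.0e6;     /* assumed crystal, Hz */
    const double c8 = 4186.009;        /* C8, equal temperament, A4 = 440 Hz */
    const char *names[12] = {"C","C#","D","D#","E","F","F#","G","G#","A","A#","B"};

    for (int n = 0; n < 12; n++) {
        double f_ideal = c8 * pow(2.0, n / 12.0);        /* ideal top-octave note */
        double ticks_exact = f_clock / (2.0 * f_ideal);  /* ticks per half-period */
        long ticks = lround(ticks_exact);                /* what a counter can actually do */
        double f_actual = f_clock / (2.0 * ticks);
        double cents = 1200.0 * log2(f_actual / f_ideal);
        printf("%-2s ideal %8.3f Hz  ticks %4ld  actual %8.3f Hz  error %+6.2f cents\n",
               names[n], f_ideal, ticks, f_actual, cents);
    }
    return 0;
}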

The approach of doing internal division ratios more or less directly gets complex quickly. Running multiple odd-ratio dividers is how the classic TOG chips worked. It's a good approach if you only have hard-logic gates and flipflops to work with.

Old Crow's TOG used either a fast timer or isochronous code to do this. He only got a "top" octave four octaves down from the actual top one. I haven't looked back at his code to see whether you could make a real top octave generator simply by using blinding-fast clock rates.

I hit on the idea of making one note per chip, plus several octaves down from that. It's not only almost trivial to code, it winds up saving a top octave generator chip - the "divider" chips each generate their own top note as well as all the lower octaves, from the same clock that runs all of them.

Once you have the top-note-plus-octaves engine, you get other things. For instance, it's good if the same code generates any note; just tell it which of the 12 tones to do, and it does. So a whole 96 notes simultaneously can be done with 12 chips plus programming jumpers on each one. And once you have that, you might be able to replace the jumpers with a keyboard matrix scanning routine to read 12 or 13 switches; if you can, then the same chip will generate any of the notes on a key stroke - provided of course that you find places to stick the keyboard scanning in between the time-critical note-transition events.
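
To give a feel for how small that engine is, here's a rough sketch in plain C. The clock rate, note table and variable names are illustrative only; on the real chip the loop below is a timer interrupt and note_reg goes straight to an output port.

/* Sketch of the "one note per chip, all octaves at once" engine: a
   jumper-selected note index picks a half-period in clock ticks, and each
   time it elapses an 8-bit note register is incremented. Bit 0 is then the
   top note and each higher bit is one octave down. */
#include <stdio.h>
#include <stdint.h>
#include <math.h>

#define F_CLOCK 20000000.0                 /* assumed instruction clock, Hz */

int main(void)
{
    int jumper_note = 9;                   /* pretend the jumpers select 'A' (0..11) */
    double f_top = 4186.009 * pow(2.0, jumper_note / 12.0);   /* C8-based top octave */
    uint32_t half_ticks = (uint32_t)(F_CLOCK / (2.0 * f_top) + 0.5);

    uint8_t note_reg = 0;                  /* on hardware, written to an output port */
    uint32_t next_edge = half_ticks;

    /* stand-in for the timer interrupt: bump the register each half-period */
    for (uint32_t tick = 1; tick <= 10UL * half_ticks; tick++)
        if (tick == next_edge) { note_reg++; next_edge += half_ticks; }

    printf("half-period = %u ticks, note_reg = 0x%02X after 10 half-periods\n",
           (unsigned)half_ticks, note_reg);
    for (int b = 0; b < 8; b++)            /* frequency each octave output runs at */
        printf("bit %d: %8.2f Hz\n", b, F_CLOCK / (2.0 * half_ticks * (1u << b)));
    return 0;
}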

A key-matrix scanner lets you do any kind of key priority in software, as long as you don't insist on more than one note per octave out of the same chip. But it eats up pins. Going to a precision resistor chain plus switches lets you use a single A-D converter pin and read all of the switches at once, still living with the only-one-key-works limit. This gets you back to small (14-pin DIP) uCs with pins left over for things like an octave-select switch and stacking-handshake pins.
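
The decode on the uC side is then just picking the nearest step from one A-D reading. A sketch, assuming a 10-bit converter and 13 evenly spaced taps - a real ladder's thresholds would come out of the actual resistor values, not this even spacing:

/* Single-ADC-pin key read: each key switch taps a different point on a
   resistor chain, so each key gives a distinct voltage and one reading
   identifies the (single) pressed key. */
#include <stdio.h>
#include <stdint.h>

#define ADC_FULL_SCALE 1023      /* assumed 10-bit converter */
#define NUM_KEYS       13        /* 12 notes + 1, a 13-switch octave */

/* returns 0..12 for a pressed key, -1 for "no key" (input pulled to full scale) */
static int decode_key(uint16_t adc)
{
    if (adc > ADC_FULL_SCALE - ADC_FULL_SCALE / (2 * NUM_KEYS))
        return -1;
    /* keys assumed evenly spaced across the ladder; pick the nearest step */
    return (adc * NUM_KEYS + ADC_FULL_SCALE / 2) / ADC_FULL_SCALE;
}

int main(void)
{
    /* fake readings standing in for the ADC: idle, lowest key, a middle key */
    uint16_t samples[] = { 1020, 15, 410 };
    for (unsigned i = 0; i < sizeof samples / sizeof samples[0]; i++)
        printf("ADC %4d -> key %d\n", samples[i], decode_key(samples[i]));
    return 0;
}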

Quote from: Resynthesis on November 04, 2013, 08:52:08 PM
Have you ever used formal methods such as Z, B or CSP? If not, the idea is that the system is specified abstractly using formal logic, so you can reason about and prove properties of your formalism. In some domains this leads to direct generation of source code which should be provably correct (provided, of course, that your compiler and hardware are provably correct). The more generally applicable halfway house is to use a tool such as SPARK Ada, which lets you instrument your code to ensure your formally defined conditions are met by the implementation. When I was a lad SPARK cost an arm and a leg, but there's a GPL version now.
No, I've not used any of those. I *was* extruded through some instruction where we were taught to do the low levels of provably-correct code, with the whole proof-of-correctness being the desired end result. So I'm familiar with the ideas.

This was a long time ago. As I dimly remember, this was after the Apple II and before the IBM PC - probably closer to the PC.

I suspect that it was in the infancy of "correctness" in programming, before the tools were there. It's a neat idea, and of course it does need to be cased up inside some greater tool to take care of what I found to be dreary proofs on each statement or fragment of a statement.

The whole issue of trustworthy programming is critically important, but I think it went to sleep in the explosion of personal computers. As recently as the early 2000s I was managing a kernel development/maintenance department for a significant port of Unix. Proof of correctness as a programming technique/adjunct was never even mentioned.

And I think that something like "trustworthiness" is a better objective than "correctness". Correctness is a necessary precondition to trustworthiness, but it's not nearly enough, as the malware writers of the world increasingly become employees of governments and organized crime (if you make a distinction between those two things  :icon_eek: ). What we need is a system of programming formalisms that lets you prove there are no buffer overrun exploits and/or backdoors.

You're right, this needs provably correct/trustworthy compilers and hardware. [As an aside, hardware and a suite of compilers are two halves of the same thing. One without the other is badly crippled, and a matched set is wildly synergistic.]

In time, I think we'll get that. Not for people, as the general public is viewed as a farm to be plowed and harvested by both governments and organized crime, but from the military and/or banking. After the first government falls to hacking break-ins, or after the first few international banks are wiped out by some exploit, there will be enough interest to make this a priority. Eventually that will trickle down to the masses. I won't see it trickle down, as it will be more decades than I have left. It is of course a matter of money, but there's hope: the identity-theft "business" is now larger than the global drug trade. Lots of money to be made or lost makes the development costs more affordable by comparison, and provides more fear and greed to motivate the work.

So in lieu of getting things secure or provably correct, I harped on my programmers to have good code hygiene.
R.G.

In response to the questions in the forum - PCB Layout for Musical Effects is available from The Book Patch. Search "PCB Layout" and it ought to appear.

Resynthesis

Absolutely, I come from a hard real-time, safety-critical (mainly avionics) background and, obviously, these things are important, even to the extent of verified compilers and hardware. I do think that some of these ideas should get into more mainstream development. However, just the mention of formal methods instils fear in many so-called engineers. Maybe if languages such as Eiffel had been more widely used we'd be in a better place :icon_neutral:

amptramp

Quote from: Resynthesis on November 05, 2013, 10:51:34 AM
Absolutely, I come from a hard real-time, safety-critical (mainly avionics) background and, obviously, these things are important, even to the extent of verified compilers and hardware. I do think that some of these ideas should get into more mainstream development. However, just the mention of formal methods instils fear in many so-called engineers. Maybe if languages such as Eiffel had been more widely used we'd be in a better place :icon_neutral:


I was not involved in software, but I was our company delegate to various conferences such as Ada/JUG back in 1988 in NYC, and to places such as the Software Engineering Institute at Carnegie-Mellon University in Pittsburgh, where Dr. Lui Sha was doing original work on Rate Monotonic Scheduling. Part of the problem with programming is making sure the code does the correct job; the other part is ensuring that real-time performance is bounded (i.e. deterministic), so that you can actually determine the range of times it takes for an asynchronous event to be recognized. (The only reason I was selected as our delegate was that I take very good minutes of the meetings. As far as software goes, I claim only enough knowledge to be dangerous.)
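
For anyone curious what "bounded" buys you: the classic Liu and Layland result, which the SEI rate-monotonic work built on, says a set of n independent periodic tasks with rate-monotonic priorities is guaranteed to meet its deadlines if total CPU utilization stays at or below n*(2^(1/n) - 1). A toy calculation, with made-up task numbers:

/* Liu & Layland utilization bound for rate-monotonic scheduling.
   The three tasks below are invented, just to show the arithmetic. */
#include <stdio.h>
#include <math.h>

int main(void)
{
    /* hypothetical tasks: {worst-case runtime, period}, both in ms */
    double tasks[][2] = { {1.0, 10.0}, {2.5, 25.0}, {5.0, 50.0} };
    int n = 3;
    double u = 0.0;
    for (int i = 0; i < n; i++)
        u += tasks[i][0] / tasks[i][1];

    double bound = n * (pow(2.0, 1.0 / n) - 1.0);   /* about 0.780 for n = 3 */
    printf("utilization %.3f, RMS bound %.3f -> %s\n",
           u, bound, u <= bound ? "guaranteed schedulable" : "need exact analysis");
    return 0;
}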

I understand the idea of verified compilers, but at the time some manufacturers of Ada compilers had only shipped a small number of copies - I remember one company saying it was one of the biggest, with an installed base of 36 users! Our senior programmers were not impressed and said that although formal verification does afford some guarantees, a good C compiler with thousands of users will have its bugs found more thoroughly than a formal verification process would find them.

The DoD was trying to promote the use of Ada as their standard language at the time. At the Ada/JUG meeting (the JUG was the Jovial Users' Group; Jovial was the original military programming language), they glossed over the non-deterministic nature of the Ada rendezvous (which was the method of switching context) and didn't impress many people. They started with about six people on stage, all talking amongst themselves and interrupting each other, but the real tone of the meeting became obvious when one of the people on stage mentioned "reusable code" as an advantage of Ada and someone in the audience interrupted to ask, "What good is reusable code without a reusable spec?" It was the first time I had seen a meeting of about 100 people go dead silent for 5 full seconds. Then one of the people on stage mumbled, "Who let him in here? Somebody call security." It's nice to know that the people who are supposed to be in charge of defending democracy don't believe in it themselves.

Resynthesis

Yes, there were many problems with the spec for hard real time, hence subsets such as the Ravenscar Ada profile were devised and incorporated into the Ada 2005 spec. The university where I worked was deeply involved with Ada and with the avionics industry and, I think, did a lot of decent work in getting verified systems into industrial use.

While you're right that users find bugs in compilers, the problem is when they find them. In avionics applications you want your code / compiler / hardware to work the first time, not expose a subtle bug when you're 6 km up in the air! There are pretty decent verified compilers around (including for C) these days. One major problem now is how to make verification portable across architectures - this includes functional correctness of the code as well as non-functional requirements for safety, temporal behaviour, etc.

Ada should serve as a lesson, and we should now be trying to avoid the same pitfalls with approaches such as Real-Time Java.