TAC Scorpion revival

Back in the days when everybody was using some Soundcraft n*200 (n = 1,2,3,4..) mixing desk, I used to prefer a TAC Scorpion console. We are talking late '80s, early '90s here..

So this is my way-too-much-work project to investigate why I preferred that console over others and, at the same time, make it a bit more 2022.


Now this is most certainly not how these consoles looked, so let's start with a picture of an original TAC Scorpion:

This is not my original console; that one has disappeared into oblivion, but it was very similar to the one shown on the left. All the VU meters were broken; they all had that after a year or two..

So instead of trying to fix those, the first thing I did was get rid of some of the heavy metal casing. Boy, are these things built like armoured tanks.

I do like VU meters on every channel though! Not so much to adjust gain (oh man, I can rant on about that one), but they really come in handy to quickly see which synth/sequencer/vocal channel is doing that solo in a live situation😏

So I made a VU board with a vintage chip everybody knows: the LM3915. As there is no audio passing through this chip I didn't care where it came from: AliExpress FTW! I designed a simple PCB in cool black, and here you go: a DIY VU meter:
I mounted them to the faders, which I had to clean anyway. And I used the same feed point for the fader and the measuring input (as in PFL), so no long lines running across the console..





..to be continued

STM32 in final 'product'

Let me write a round-up of this research on using an MCU as a speaker processor.

By this time I had switched from Keil MDK to STM32CubeIDE, which at this moment (November 2021) seems to be the weapon of choice for most STM32 developers.

Again, for future reference, a quick how-to on including a math/DSP library.

Together with your installation of the IDE you will also have installed some repositories for ARM software products. You are looking for the CMSIS/DSP libraries. The path will be something like /STM32Cube/Repository/STM32Cube_FW_F4_V1.25.2/Drivers/CMSIS/DSP/ in your home directory (on Linux). It will be similar on macOS or Windows.

Start a new project in the IDE, right-click the project's name and create a new source folder. Name this folder DSP. Now right-click again and select Import > File System. Select both the source (Source) and header (Include) folders from the above-mentioned repository, as in the screenshot. Click Finish.

 

Now you have both the source and header files in your project, but your compiler knows nothing about them. So: right-click the project and open the project properties. Find the compiler's include path settings and add the new path to your include folder. Check the screenshot.

Now you can include all these nice ARM DSP tricks with: #include "arm_math.h". Try to build (that's embedded speak for compile)... Hundreds of errors! But if you check the first ones you will see that the compiler needs two preprocessor directives. Find those in the project properties and add both: ARM_MATH_CM4 (we are using an M4 core) and __FPU_PRESENT = 1U (yes, we do have a floating point dohickey in our processor). Oh, and that's two/2 underscores! __
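For reference, the project settings end up equivalent to this fragment (the properties dialog is the proper place for the defines, but this makes the two-underscore detail visible):

```c
/* Equivalent of the two preprocessor directives set in the
   project properties; they must be defined before arm_math.h
   is pulled in. Note the DOUBLE underscore on __FPU_PRESENT. */
#define ARM_MATH_CM4
#define __FPU_PRESENT 1U

#include "arm_math.h"
```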


Now you can start programming, making use of block processing: taking chunks of, say, 48 samples and exploiting the increased efficiency of the CMSIS libraries.
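The block-processing pattern itself is simple enough to sketch without any CMSIS calls at all. A minimal, self-contained direct form I biquad run chunk by chunk (plain C; the struct and function names are mine, not CMSIS):

```c
#include <stddef.h>

#define BLOCK_SIZE 48

/* One direct form I biquad. The state carries over between calls,
   which is exactly what makes chunk-wise processing seamless. */
typedef struct {
    float b0, b1, b2, a1, a2;   /* coefficients (a0 normalised to 1) */
    float x1, x2, y1, y2;       /* previous inputs/outputs */
} biquad_t;

static void biquad_process_block(biquad_t *f, const float *src,
                                 float *dst, size_t n)
{
    for (size_t i = 0; i < n; i++) {
        float x = src[i];
        float y = f->b0 * x + f->b1 * f->x1 + f->b2 * f->x2
                - f->a1 * f->y1 - f->a2 * f->y2;
        f->x2 = f->x1; f->x1 = x;
        f->y2 = f->y1; f->y1 = y;
        dst[i] = y;
    }
}
```

With CMSIS you would instead call arm_biquad_cascade_df1_f32() with the same blockSize idea; mind that CMSIS expects the feedback (a) coefficients already negated.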

Check all those filtering functions!!

There's loads to discover but let me disclose this one: 

Decades ago I met Ed Long (yup, the guy who 'invented' Time-Alignment) at a pro audio trade fair. He demonstrated his closed-box, single 15" sub. That was astounding (as was the excursion). His trick was to use electronics branded ELF, not filters, to EQ the sub.


Much later, Linkwitz developed a filtering method known as the Linkwitz Transform: a similar way to correct the response of a subwoofer.

I never got any good results with this method, as professional DSPs don't provide a way to import your own biquad coefficients. And using shelving filters and EQ will never have the required precision.


While using this CMSIS library function:

void arm_biquad_cas_df1_32x64_q31(const arm_biquad_cas_df1_32x64_ins_q31 *S, const q31_t *pSrc, q31_t *pDst, uint32_t blockSize)
Processing function for the Q31 biquad cascade 32x64 filter.

I managed to reproduce the sensation I had at that trade fair decades ago.. Thanks, Ed!

Do understand that you really, really need the above precision (in fixed point) to get any decent results! If you are into the math: understand how a biquad calculation works and how the representation of the coefficients (which are all close to the unit circle) affects the calculations at very low frequencies.

(Anyone remember that at some point (2010?) we got this 'use for better LF response' checkbox in LondonArchitect?)

On Instagram you can find some pictures of cabinets where I applied this.

----->

Oh, and this is the latest board I developed, everything being stalled a bit by the global chip shortage:




Practical FIR

Part 3

As so often in engineering, when some new technology is introduced we go into 'nec plus ultra' mode. Suddenly you will find a huge range of applications where everything revolves around that new tech. The same goes for FIR filtering.

The first appearance of FIR in (DIY) audio was as a means to do room EQ. The idea was to measure a system's response in a room, do quite some averaging, and apply an overall EQ that would correct everything. First: there is of course no such thing as room EQ. The only way to change the acoustics of a room is by changing its dimensions and/or changing the absorbing/reflecting surfaces. But the solution is better than using only conventional (minimum phase) EQ, as we have been doing with our 31-band graphics.

Some room effects are minimum phase, and thus one could use a graphic EQ to try to address the problems. The definition of minimum phase behaviour dictates that your ideal EQ point will have both the necessary magnitude and phase compensation.

So for these minimum phase phenomena the EQ will NOT introduce 'PHASE ISSUES' (cringe).

However: loads of stuff you would like to EQ in acoustics is NOT minimum phase. Remember playing a festival where, during nightfall, the HF content of your PA changes drastically? And adjusting your graphic EQ doesn't help the way you wanted? Well, that's a non-minimum-phase phenomenon, and it calls for a linear phase EQ, which as we now know is only possible with FIR.

 

Another example would be (conventional) EQ across a line array. Hopefully everybody understands you can't apply different EQ settings to different elements in your line array, right? The different phase behaviour of the different elements would in that case wreak havoc on the overall dispersion of the array! The only way you can get away with this is by using linear phase EQ. Something that French manufacturer of brown boxes does understand.

So when and where to apply linear phase FIR filters, as a means of filtering without affecting phase, is something that needs thought!

It is not the miracle solution for everything; it is just another tool, and it has pros and cons like any other tool.

So yes, as has been explained in a million places: FIR filters can be computationally heavy, but why should that be mentioned over and over as a problem?

A more interesting difference, if you really want to set IIR against FIR, is the way the precision of the calculations works out sonically. Fixed point vs floating point. Internal bit depth. The way rounding errors work in your algorithm. Audibility of the pre-ringing (or not?). That sort of thing. In the MCU/DSP part of this blog I will write a bit about this.

If all this is cleared up, one big topic still remains to be debated:

IS PHASE AUDIBLE?

(or, more precisely: does applying an allpass filter change the way (stereo) audio is perceived?)

One will understand that I (and my peers) have an opinion contrary to that of the big names in our industry. This is the internet: I can't demonstrate it here, but I encourage you to find out for yourself!