SallySoundFX

Sally Whitemane submitted a new resource:

SallySoundFX - Audio Visualization Plugin

SallySoundFX is designed to automate visualizations based on the audio currently playing in VAM.
For example, it can listen to a frequency and flash a spotlight with every beat.
It can be linked to anything that accepts 'values Actions' from 0.0 to 1.0.
It can use microphones too.

By default, VAM can only do basic audio visualizations based on the volume (as far as I know).
SallySoundFX is therefore more accurate. A 'SallySoundFX Demo' scene is included:
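For illustration, the 0.0-to-1.0 trigger output described above can be thought of as a clamped, normalized band magnitude. A minimal language-agnostic sketch in Python (the function name and the reference level are hypothetical, not from the plugin's actual code):

```python
def trigger_value(band_magnitude, reference_level):
    """Normalize a raw spectrum-band magnitude into the 0.0-1.0 range
    that a trigger emits (clamped at both ends)."""
    if reference_level <= 0.0:
        return 0.0
    return max(0.0, min(1.0, band_magnitude / reference_level))
```

Anything wired to the trigger then just consumes that 0.0-1.0 value.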

Read more about this resource...
 
Does this plugin happen to need Windows 10 to run? I have the latest version of VAM installed on Windows 7, and it unfortunately didn't work. I only got an error message saying something along the lines of the .cslist file couldn't be found.
 
Very nice. Is it possible, code-wise, to make it pick up WebPanel audio?
 
Does this plugin happen to need Windows 10 to run? I have the latest version of VAM installed on Windows 7, and it unfortunately didn't work. I only got an error message saying something along the lines of the .cslist file couldn't be found.
In theory there is no reason for the Plugin not to run on Windows 7, as long as VAM runs normally.
But I built and tested it on Windows 10 only.
Before I set up a Virtual Machine to test this:
can you please check that it is not a compilation error because MacGruber.LogicBricks.12 was not installed?
It's the only dependency. VAM lists a second one, but that one comes preinstalled.
If it is not the following problem due to the missing 'LogicBricks', please send me a private message with a copy of your error message from the error log.
Then I'll test in a VM.
Code:
!> Missing addon package MacGruber.LogicBricks.12 that package Sally.SallySoundFX.1 depends on
!> Compile of Sally.SallySoundFX.1:/Custom/Scripts/Sally/SallySoundFX.cslist failed. Exception: System.Exception: Path MacGruber.LogicBricks.12:\Custom\Scripts\MacGruber\LogicBricks\Internal\MacGruber_Utils.cs not found
  at MVR.FileManagement.FileManager.ReadAllText (System.String path, Boolean restrictPath) [0x00000] in <filename unknown>:0
  at MVRPluginManager.SyncPluginUrlInternal (.MVRPlugin mvrp, Boolean isFromConfirmDenyResponse) [0x00000] in <filename unknown>:0
!> Compile of Sally.SallySoundFX.1:/Custom/Scripts/Sally/SallySoundFX.cslist failed. Errors:
Very nice. Is it possible, code-wise, to make it pick up WebPanel audio?
Thanks.
I am looking into this right now.
Unfortunately it does not look very promising.
The Plugin picks up the audio data from an AudioListener.
This Listener is always attached to the Camera:
[screenshot: VAM audio log.png]

(That is the camera that renders the 3D scene for the monitor, not the 'WindowCamera'-Atom)
This makes sense, since the user should hear 'less' if the audio source is far away from the camera.
Basically spatial 3D audio.
The WebPanel and similar objects like the TV seem to play the audio directly and ignore the spatial concept.
You can test this in VAM:
Music playing on an AudioSource will become quieter the further the user moves away.
Music playing on a WebPanel does not.
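The distance test above can be sketched with a simple linear rolloff model. (Unity also supports a logarithmic rolloff curve; the min/max distances here are made-up defaults, not VAM's actual values.)

```python
def linear_rolloff(distance, min_distance=1.0, max_distance=20.0):
    """Volume factor for a spatial AudioSource under linear rolloff:
    full volume inside min_distance, silent beyond max_distance."""
    if distance <= min_distance:
        return 1.0
    if distance >= max_distance:
        return 0.0
    return 1.0 - (distance - min_distance) / (max_distance - min_distance)
```

A WebPanel effectively ignores this model and plays at constant volume regardless of distance.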

VAM always seems to load a hidden 'TestAudioSource'.
I tried to get the data from it too -> nothing.

So for now - the answer is: No, not possible.

Since I haven't figured out yet how the WebPanel actually plays the audio, there is still hope.
If I do, maybe there is another way to get the required data. Chances are very low, though.

The audio has to come from the 'Scene audio' or AptSpeaker / AudioSource / RhythmAudioSource.
Note that by default the AptSpeaker-Atom has spatial audio disabled.

I'll report back if I find out more.
 
Great info, and happy hunting (I gave up my quest for it). Whoever controls WebPanel audio controls the universe; it's more or less the holy grail, as it would make dynamic online text-to-speech > head audio > mouth movement, models dancing to YouTube etc. possible. It's next-level shit.

Another way would be some sort of proxy, fetching the sound server-side online (e.g. the output from a YouTube URL or TTS server) - not a local server, too much to ask from the user to install local stuff - and having that server spit out a format the scene audio can stream, as if one had inserted a URL there. I know RhythmAudioControllers are able to fetch that.
 
@God
To find out how audio is handled for the web stuff, I started renaming some .dll files,
then started VAM with a WebPanel test scene to see if it still works. The scene was playing a YouTube video.
It is weird that VAM has so many audio library files, btw.

I tested these suspicious files until I broke the WebPanel:
VAM\VAM_Data\Managed\
Bass.Net.dll Version 2.4.13.2 http://www.bass.radio42.com http://www.un4seen.com -> nope still working
NAudio.dll Version 1.8.4.0 https://markheath.net/category/naudio -> nope still working
UnityEngine.UnityWebRequestAudioModule.dll -> nope still working
UnityEngine.WebModule.dll -> nope still working


VAM\VAM_Data\Plugins\
bass.dll -> nope still working
AudioPluginMsHRTF.dll -> nope still working
AudioPluginOculusSpatializer.dll -> nope still working
libmpg123-0.dll -> nope still working

zf_cef.dll <- Yep that is it!
ZFProxyWeb.dll <- Yep that is it!


This leads to the audio being handled by ZenFulcrum, an embedded browser.
https://zenfulcrum.com/browser/docs/Readme.html#zfbrowser-documentation

I will end my search here. Unlike a proper audio library, this Chromium-based browser is unlikely to offer a function to get sample data. Audio is not even mentioned in the documentation, except for some supported codecs. And even assuming it did offer such a function, I still could not grab it from a Plugin, because there is no interface/connection to it. End of the road.

So the only option is sending the data to the Plugin from an external source. That's bad. The user would have to install extra stuff, and it would never be in perfect sync. Assuming the WebPanel plays a music video and has some lag, but the external source analyzing the audio has none, it would desync ... and since there is no way to read the video progress - how to resync it? *ugh*

Another idea would be to have a Windows program record whatever VAM is playing in Windows and silently send it back into a fake microphone device. That device could then be picked up by VAM/Unity and played on an AudioSource. However, this AudioSource would have to be super silent/far away - or else: feedback loop :D This would at least have no desync issues and little lag. But still, installing external software is *ugh*. :poop:

Anyway - all solutions kind of suck. 😕 I'm out of 'good' ideas now.

Edit: This is a log of me iterating over the relevant objects and components to prove that the 'WebPanel' (and similar) do not use UnityEngine.AudioModule at all. That basically means getting the sample data from there is impossible:
Code:
>>>> All VRWebBrowser Objects <<<<
VRWebBrowser: BrowserGUI

VRWebBrowser.transform.parent: Canvas
isRootCanvas: True
pixelRect: (x:0.00, y:0.00, width:1920.00, height:1080.00)
renderMode: WorldSpace
component type/name: UnityEngine.RectTransform / Canvas
component type/name: UnityEngine.Canvas / Canvas
component type/name: UnityEngine.UI.CanvasScaler / Canvas
component type/name: UnityEngine.UI.GraphicRaycaster / Canvas
component type/name: CanvasSizeControl / Canvas
component type/name: IgnoreCanvas / Canvas

VRWebBrowser.transform.parent.parent: rescaleObject
component type/name: UnityEngine.Transform / rescaleObject
component type/name: SetTransformScale / rescaleObject
component type/name: MaterialOptions / rescaleObject

VRWebBrowser.transform.parent.parent.parent: object
component type/name: UnityEngine.Transform / object
component type/name: MeshVR.PresetManager / object
component type/name: MeshVR.PresetManagerControl / object
component type/name: UnityEngine.Rigidbody / object
component type/name: ForceReceiver / object
component type/name: UnityEngine.ConfigurableJoint / object
component type/name: PhysicsMaterialControl / object
component type/name: CollisionTrigger / object

VRWebBrowser.transform.parent.parent.parent.parent: reParentObject
component type/name: UnityEngine.Transform / reParentObject

VRWebBrowser.transform.parent.parent.parent.parent.parent: WebPanel
component type/name: UnityEngine.Transform / WebPanel
component type/name: Atom / WebPanel
component type/name: PrefabEvolution.EvolvePrefab / WebPanel

VRWebBrowser.transform.parent.parent.parent.parent.parent.parent: SceneAtoms
component type/name: UnityEngine.Transform / SceneAtoms
 
[screenshot: error log.png]

The error on loading the demo scene only shows the "SallySoundFX.cslist doesn't exist" error, but upon reloading the plugin itself (or loading it from scratch), the MacGruber dependency issue comes up - which is strange, because the file definitely is there, and in the right location as well.
 
@Nameless Vagabond
I did test the Plugin inside a Windows 7 Virtual Machine using VirtualBox.
Except for running slower inside the VM, it worked normally.

From the screenshot I cannot tell whether the Explorer window shows the content of a .var file or a normal folder.
If Explorer shows your VAM\Custom folder, the installation is wrong.

The correct paths are:
VAM\AddonPackages\Sally.SallySoundFX.1.var
VAM\AddonPackages\MacGruber.LogicBricks.12.var


I'm not sure whether your Explorer is maybe configured to show the content of .var files like a zip file - but I doubt it.
If you extracted content from .var files to VAM\Custom or VAM\Saves, you must remove these files, or else this will cause all kinds of trouble.
(Actually, the installation is kind of messed up in that case.)

Did you install the Plugin using VAM Hub or manually?

If you are 100% sure your paths are correct, maybe there is a folder permission problem. That could be the case if your VAM is saved to a user folder like the Desktop.
But in that case other Plugins should fail too. I'd move VAM to another drive then.
 
Oh BOI!
While digging deeper into VAM's assembly references, I made a very interesting discovery today!

There is an (unknown) version of RhythmTool inside VAM:
https://assetstore.unity.com/packages/tools/audio/rhythmtool-15679#publisher
https://hellomeow.net/
https://forum.unity.com/threads/rhythmtool-music-analysis-for-unity.270866/

It seems to be used by the RhythmXXX-Atoms in VAM. Btw, I cannot find any documentation for how the Rhythm-Atoms work at all.
I just don't like how it seems to waste CPU cycles by using a 'fftWindowSize' of 4096! This is overkill!
Decompiling RhythmTool.Update() would be interesting, to learn how it processes the data it probably grabs from GetSpectrumData() in there.

Decompiled it. Turns out it uses a custom LomontFFT and not a Unity GetSpectrumData() call. No idea how good and fast that thing is. It *seems* to be optimized for high-precision (double), high-quality data, and not for fast real-time calculations.
My idea was to potentially 'steal' the data from RhythmTool and save the CPU time of the GetSpectrumData() call. But the data is in a private double[] array. Also, it may or may not have been processed by the FFT algorithm yet, depending on whether my Plugin or RhythmTool gets its Update() call first.
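For context on the 4096-vs-512 tradeoff: the frequency resolution of one FFT bin is the sample rate divided by the FFT size, so a bigger window buys finer resolution at a higher per-update cost. A quick sketch, assuming 44.1 kHz audio:

```python
def bin_resolution_hz(sample_rate, fft_size):
    """Width of a single FFT bin in Hz: sample_rate / fft_size."""
    return sample_rate / fft_size

# At 44.1 kHz: a 4096-point FFT gives ~10.8 Hz per bin,
# while a 512-point FFT gives ~86.1 Hz per bin.
```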

This is from poking around in a scene with an MP3 playing on a 'RhythmAudioSource':
Code:
RhythmTool.fftWindowSize: 4096
RhythmTool.frameSpacing: 1500
RhythmController.name: RhythmSource
RhythmController.rhythmTool: RhythmTool
RhythmController.rhythmTool.Initialized: True
RhythmController.rhythmTool.LastFrame: 1308
RhythmController.rhythmTool.CurrentFrame: 1009
RhythmController.rhythmTool.TotalFrames: 12658
RhythmController.rhythmTool.Lead: 300
RhythmController.rhythmTool.Interpolation: 0.7772827
RhythmController.rhythmTool.analyses.Length: 3
analysis.name: Low
analysis.advancedAnalysis: False
analysis.Frames.Length: 12658
analysis.start: 0
analysis.end: 12
analysis.name: Mid
analysis.advancedAnalysis: False
analysis.Frames.Length: 12658
analysis.start: 30
analysis.end: 200
analysis.name: High
analysis.advancedAnalysis: False
analysis.Frames.Length: 12658
analysis.start: 300
analysis.end: 550
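Assuming the start/end values in the log above are FFT bin indices, 44.1 kHz audio, and the fftWindowSize of 4096, they would translate to rough frequency ranges as sketched below. (This mapping is an assumption; RhythmTool's exact interpretation of those fields may differ.)

```python
def bin_to_hz(bin_index, sample_rate=44100, fft_size=4096):
    """Frequency of an FFT bin, assuming linearly spaced bins."""
    return bin_index * sample_rate / fft_size

# Low  (bins 0-12)    -> ~0-129 Hz
# Mid  (bins 30-200)  -> ~323-2153 Hz
# High (bins 300-550) -> ~3230-5922 Hz
```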
 
Thanks for playing tech support for me - I did manually extract the VARs into my Custom and Saves folders, a habit I developed in the teething days of the VAR system.

Deleting everything and putting the VARs into the AddonPackages folder worked - this plugin is seriously cool!
 
Hello! Dayum, I found your amazing plugin and demo scene in MonsterShinkai's environment, where he mentions you!
I love this plugin you made very much!!!

Question:

In the demo scene I wanted to add more lights or objects that move along with the music, but I couldn't do it. For example, I added a very similar neon light to the scene, but it didn't change its lighting to the rhythm / stayed at the default of 1 light.

Looking forward to your help :D
 
The demo scene does not trigger the lights directly. Here I trigger the visible UI sliders from the plugin first, and they trigger the lights and more.

Let's assume, for example, that you have added an "InvisibleLight" and now want to trigger it with the Plugin.

1) Open the SallySoundFX customUI and click the red [Trigger 1 Actions]
2) [Add Transition Action] > [Settings...]
3) Here you set up which Atom should be 'connected' to the trigger. For a light you can control multiple things, like range, color or intensity. For this example, let's use intensity to have the light flash based on the trigger's value.
Receiver Atom: InvisibleLight
Receiver: Light
Receiver Target: intensity
4) Now set the (Start) intensity slider to 0.0 and the (End) intensity slider to 1.0 - or higher, if you want to 'amplify' the trigger value
5) [OK] > in complex scenes you should add a descriptive trigger name now (that's optional) > [Done]
6) Play music and use the red [Trigger 1 frequency in Hz] slider to find a good source frequency for the trigger. Note that moving this slider is visually indicated with a red bar in the spectrum, to make this easy.

Leaving the Trigger 1 frequency at around 100 Hz should give you a flashing light with every beat - probably the most common use. Technically this could also be used for other things, like speech lipsync, but I think there are better, more specialized Plugins for that.
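The (Start)/(End) sliders from step 4 effectively define a linear mapping of the 0.0-1.0 trigger output onto the receiver target. As a sketch (function name is illustrative, not the actual VAM implementation):

```python
def receiver_value(trigger, start, end):
    """Interpolate the receiver target (e.g. light intensity) between
    the (Start) and (End) slider values, driven by the 0.0-1.0 trigger."""
    return start + trigger * (end - start)
```

Setting End above 1.0 is what 'amplifies' the trigger: a full beat (trigger 1.0) then drives the intensity past 1.0.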

For people new to VAM here is a guide on how to add this Plugin to your own custom scene:
https://www.patreon.com/posts/62558733

Unfortunately, I am kind of stuck with the development of this Plugin.
My original plan was to use SallySoundFX as a sort of "Master" Plugin and then have many optional Visualization Plugins that the user can load on demand. Like that dancefloor, except not loaded via a .cslist file, but as a separate Plugin. What I did not take into account is the fact that it's not possible to set up a C# interface to communicate between Plugins, as far as I know.

Currently I do have a private version with much better signal processing for visualizations. It no longer uses fixed amplification values for the frequency bands used by the dancefloor. It can now adjust itself to any music automatically.
 

Thank you very much!!!!
 
Your plugin is amazing! I hope you make a beat saber kind of thing someday; that would be awesome.


In case it helps: I've done communication between plugins by using Text atoms as proxies (like serialized data), and also by altering the public JSONStorableString fields of pluginA from pluginB - sending data and triggering the callbacks attached to them this way. A more conventional approach might be to send the MVRScript object (slave) to the master object; I think MacGruber is doing something like that in his MacGruber_Utils.cs (LogicBricks), for reference. There's also this, which might help: https://docs.unity3d.com/ScriptReference/Component.BroadcastMessage.html - I think AcidBubbles does something like that in his DevTools plugin, for reference.
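The 'alter a public field and trigger its callbacks' pattern described above can be sketched language-agnostically. Below is a Python stand-in for a JSONStorable-style field; the class and method names are illustrative, not the VAM API:

```python
class StorableString:
    """Sketch of a storable field: writing the value also fires any
    callbacks registered on it, so a write doubles as a notification."""
    def __init__(self, value=""):
        self.value = value
        self._callbacks = []

    def on_change(self, callback):
        self._callbacks.append(callback)

    def set(self, value):
        self.value = value
        for callback in self._callbacks:
            callback(value)

# pluginA registers a handler on its public field;
# pluginB later writes into that field and thereby notifies pluginA:
received = []
field = StorableString()
field.on_change(received.append)
field.set("beat:0.87")
```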
 
@SPQR
Thanks! That is very helpful.
I like the BroadcastMessage idea, because it does not depend on anything except Unity.
I need to do some research and test what the performance is like for each solution.
 
What would be the best way to try and get BPM out of this - or is it already in there and I am missing it? If the trigger is at the right frequency, then I get the beat, but I'm not sure how to use a variable and time to get BPM without modding the code.

Any suggestion?
 
SallySoundFX only sends the live signal value for a frequency. Technically you can modify the code to measure BPM, but then you'd have to play the song for a while first to detect the BPM. I don't think this would be very good.
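If one did modify the code, deriving BPM from collected beat timestamps is straightforward - e.g. via the median inter-beat interval, which is robust against a few missed or doubled beats. A sketch, not part of the plugin:

```python
def estimate_bpm(beat_times):
    """Estimate tempo from beat timestamps (in seconds) via the
    median inter-beat interval; None if there are too few beats."""
    if len(beat_times) < 2:
        return None
    intervals = sorted(b - a for a, b in zip(beat_times, beat_times[1:]))
    median_interval = intervals[len(intervals) // 2]
    return 60.0 / median_interval
```

This illustrates the catch mentioned above: the list only fills up while the song is playing, so the estimate is unavailable at the start.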

To get a BPM value before playing a music file, it's probably better to hook into the VAM-integrated RhythmTool.
It analyzes the file while loading, and creates 3 very basic frequency band profiles: low, mid and high.
So looking into that low profile may be worth it to get the BPM.

Either way - you'd have to touch code. I'm busy with Godot game development atm. I've had an unfinished Plugin version 2 here for months now.

RhythmTool: no live signal; low precision, with 3 frequency bands of preprocessed data.
SallySoundFX: allows a live signal (microphone); high precision, with 128 frequencies accessible and 8 bands used internally for visualization.
RhythmTool's precision is technically higher, since it analyzes with an FFT size of 4096 and SallySoundFX only 512. But all that precision is lost by reducing to only 3 frequency bands.
 
@MonsterShinkai @SPQR @FighterCNash @C&G-STUDIO

Releasing SallySoundFX 2 quietly for now on my (free and public) Patreon.
Just in case I F'ed up something I will wait before I upload this to the Hub.
This will hopefully be the foundation for many cool Visualizations.

SallySoundFX 2.00 Release

I've tested long enough. Only visual bugs remain and performance is good. Everything "should" work. 🙏
Feedback welcome of course! And please be critical! I can take it. :cool:

Hope this does not disappoint people because it still looks 'basic'. What counts is the potential of all the crazy stuff you can do with this. If I had VR hardware, I'd try to make a beat saber Plugin ... that's complicated, but possible. However, it would never be 100% the same, since whatever object you spawn to slash is generated from a 'live' signal. Just some thoughts ...

Todo-list: more visualizations *doh*, custom shaders, reflective materials, learn about advanced mesh rendering for even more performance

This was the first "working" version a few days ago. Later I found out it leaked memory like crazy. (fixed now ofc)
 
You're so sexy. I can't wait to play with it!
 
Agree with MonsterShinkai, you sexy dude!

Thank you for your work and effort on this! I'll check it out as fast as I can, but the way life goes, I won't have time for it until in about a week. Anyway, I am really excited to get my hands on it!
 
This is looking great to me so far - awesome work, and thank you for sharing!

I thought of that beat saber thing too, and if it helps, I think the easiest way to do it is to have a 'learn song' functionality. The user would just let the song play once, you record the data in a JSON, and then it would be used in the game. Like, at 0:30 in the song you would start popping in atoms based on the data for 0:25, or something like that.
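That 'learn song' idea boils down to recording (timestamp, value) pairs on a first pass, then reading them back with a fixed offset on replay so atoms can be spawned ahead of the beat. A minimal sketch - the function name and the 5-second lead are made up for illustration:

```python
import bisect

def lookahead_value(times, values, now, lead=5.0):
    """Return the last recorded trigger value at or before (now + lead);
    0.0 before the first recorded event. times must be sorted ascending."""
    i = bisect.bisect_right(times, now + lead) - 1
    return values[i] if i >= 0 else 0.0

# Recorded on the 'learn' pass: timestamps and trigger values
times = [0.0, 25.0, 30.0]
values = [0.1, 0.9, 0.2]
```

On replay, querying at playback time t with a lead gives the data for t+lead, so objects can be spawned early enough to arrive on the beat.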

But imo a killer application for this would be something much simpler: a 'home' scene in VAM for it, where you could:
1. play songs in a folder
2. slide through "full screen" visualizations
3. have the vfx modular so that you can easily add new ones (maybe as subscenes, things you could release regularly on patreon for example)

That in itself, I think, would be amazing to a lot of people - probably more so than something like beat saber - and it might turn SallySoundFX into one of the best music visualization tools, something like a MilkDrop VR, maybe.

If it helps: the technical stuff and DIY will scare most people away, imo. Few will jump in to do visualizations themselves, no matter how easy you make it or how much you explain it. So it's probably going to be up to you to also deliver simple applications/products for it, so it actually gets used to its potential. Myself, I'd go by the Pareto principle and concentrate just on the end-user application. Creator tools seem to be a bit of a dead end for devs here, from what I've seen so far - but that's just my take on it.
 
