Topic: Why out of Real Time?
garth paine Member

Hi All,

I was playing with the example sound 'Star Atmos' today; it uses most of 2 processors to run. I then added some 'smoothed' objects onto the 'intervals' in the ScaleVocoder, and then it was out of real-time processing even though I had 5 processors free at that stage. How can that be?? It was out of processing on processors 8 and 6, and at about 80% on 7.

Why does it not just use the other processors? There is no disk activity going on; the sample is in RAM. I tried it from disk and it made no difference. I just don't understand why this sound won't use the free processing power.
Bill Meadows Member

I've seen unusual scheduling like this on other Sounds as well. It seemed to get worse when I changed from 5.16 to 5.18. It is frustrating to have a couple of DSPs maxed out while others sit idle.
SSC Administrator

Please send us the Sound you are having trouble with. We will try to take a look at it when we get back from the demo trip.
photonal Member

Andrew
garth paine Member

I have included the samples that are used so you can test it just as I can. Please let me know ASAP if I can make this work, as I really need it to be happening on Monday.
SSC Administrator

Garth -- We are downloading your timeline and sound file right now and we will try to look at it, but it is rather difficult for us right now as we are on a 10-day demo tour.

Before looking at the Sound, one suggestion that we can make is that you could try downsampling some of your Sound controls (the ones pasted into text fields). For example, if you have a Sound that controls the frequency of a lowpass filter, try using something like

(aSound L: 5) * 1000 hz + 100 hz

(assuming your expression is something like this to begin with). This will cause the cutoff frequency to be evaluated at 1/5 of the rate it was before and may reduce the amount of computation needed by the expression.
garth paine Member

OK, I shall downsample some of the control signals. I only give them data every 30 ms anyway, so I don't need them polling for data more often than that.

If you look at the timeline, I found that I can't make Angle a live control on Channel 2 -- it just blows the real-time headroom. That means I can't move that channel around the space. I can't see why this would be. I also found that adding one 'smoothed' was enough to trip over the limit, and prior to that I had more than a whole processor free. It seems strange that such small changes force it over the limit. I have reduced some other characteristics of some of the sounds to make them less processor intensive. My main concern now would be to get the track 2 Angle variable live.

BTW, I unplugged the Capy and all works fine, so that's great -- thanks for that. Also, I have been running the timeline for several days to test stability: absolutely rock solid. I can't say that of any other software synthesis platform I have used.
garth paine Member

Sorry, I also meant to ask: is there a way to feed script-driven changes to the 'Angle' and 'Level' parameters in a timeline track? For instance, I might trigger a script with a key down and feed it 2 variables as controllers, i.e. depth of curve and rate of change, and then let the script drive the live variable. Otherwise I have to do it all in MAX, which is not as elegant for these things as Kyma.
David McClain Member

>It seems strange that such small changes force it over the limit.

This is one of the things that makes software systems quite different from the usual world at large. Small changes can cause catastrophic results; witness broken code as the most common effect. I'm not just being cheeky here... I have a longstanding interest in making human-to-computer interaction more reliable. But software systems do not exhibit linear and continuous behavior, the way a bending beam might within its limits of elasticity. This is a topic of huge active research, and I suspect it will remain so for quite some time to come...

Of course, if SSC can figure out a solution to this particular problem I'll be delighted. But I suspect the root of the problem is what's called an NP-complete problem, akin to the knapsack problem of stuffing a number of objects into a finite container. There are few good solutions to this problem short of exhaustive search, and that grows by non-polynomial time (NP) with respect to the number of items to stuff into the sack (sack = Capybara, items = Sounds).

- DM
SSC Administrator

Hi Garth! Sorry that we have not had the time to look at this when we have a Capybara set up. (Our demos last 2 hours, with 1.5 hours of set up, 1 hour to tear down, and 4-6 hours of driving to the next stop on the tour.)

What David is saying about the knapsack problem is correct. It is similar to a chaotic system: very small changes can sometimes result in very big differences in scheduling. We will certainly be able to look at your Sounds this weekend, after we have arrived back at the office and have gotten things set back up.
garth paine Member

But I was seeing the Capy more as a Tardis than a knapsack -- seems much more fitting. So in this paradigm, the laws of gravity and many of the other laws of physics are completely revoked, and the perceived space is determined by the imagination of the individual. Hmmmm, looks like I might be being pulled back down to earth though.
Larry Simon Member

David, I enjoyed your comments on software brittleness. Being a comp-sci type, I was brought up on data normalization and code efficiency, and taught my share of others to think the same way. I think now the route to robustness is to copy the way nature has achieved it: use massively redundant but different competing units, with the failure of any one being of little or no consequence to the overall system.

Larry
pete Member

Hi Larry, would this nature-impersonating Kyma system take a few million years between the time we press download-and-compile and the time we get some sound out at the other end? It would be renamed "download and evolve". Maybe we would need a million Capys to act as the large chemical soup to increase the odds of the correct evolution taking place. I think I must have missed the point here.
David McClain Member

Hi Larry and Pete,

I think Larry is correct, but in the case of our brains we have a few trillion competing units working in parallel, while even a maxed-out Capy only holds 28 units... The neat thing about all that parallelism (in the brain) is that you don't even have to be particularly fast to accomplish, in a mere glance, what even the largest supercomputers can't yet do!

Still, I hope to find, or to help find, a less brittle mechanism between us humans and computers. I really hope we don't have to resort to DNA-based computing before we get there...

I love my Capy!!

- DM
Larry Simon Member

If you haven't read Kevin Kelly's "Out of Control: The New Biology of Machines, Social Systems & the Economic World", it's a bit of an eye-opener (or at least it was at the time). I read about a third, got the message, and didn't finish it, but the basic idea is that really interesting emergent behavior (like that of a swarm of bees) can result from the interaction of lots of dumb but communicating components. It's the same sort of thinking as the MIT idea of sending thousands of little semiautonomous robots to the moon, rather than a few big machines, when you want to do something useful like build a base. The problem with software is that almost every bit in an executable is a single point of failure.
garth paine Member

Hi SSC,

Just wondering if you have had a chance to look at the issue of managing load on the processors. I am of course interested regarding my timeline, and adding 'smoothed' to some parameters in track 2, but also in the example I mentioned in my first posting that is part of the Kyma example library. Looking forward to your responses...

BTW, hope the tour was fun. I drooled over the FireWire interface capability, but then I would have to upgrade my G3 PowerBook to a G4 (I don't have a FireWire PB; mine is the model before), so that just has to wait. Much better value for money adding processors at this stage...
SSC Administrator

Hello Garth,

I made some optimizations to the Sounds in your Gestation timeline that reduced the processor requirements from 7 down to 6. I emailed the timeline to you, so please try it out and we can continue to work on it from there. (Though perhaps it would be better to start a new thread, as this one is already on fire!)

The optimizations were:

* FluidDrops: removed an OverlappingMixer and replaced it with its single input
* Removed an internal SetDuration (of 1 day) on BabyGiggle
* LowPitchDrone's spectrum has only 18 partials, so I reduced the number of Oscillators to 18 to match the spectrum (it was set at 72 oscillators before, but only 18 of them would be audible)

Yes, we did have fun on the tour (thanks for asking)!

(RE FireWire on G3 PowerBooks: if yours has a CardBus slot, there are some CardBus-to-FireWire cards that would give you close to the same performance.)
All times are CT (US)
This forum is provided solely for the support and edification of the customers of Symbolic Sound Corporation.