Ninjam Evolution (MIDI for metronome)


photo: midi options.jpg

My modification of the NINJAM standalone client adds a MIDI instrument (channel 10)
to play the ticks of the metronome.

Edit: I plan to add full MIDI compliance, using a C++ lib:
https://github.com/jdkoftinoff/jdksmidi
"This is a fork of the JDKSMidi library by J.D. Koftinoff and other authors.
The main changes are the development of the class AdvancedSequencer, an all-in-one object capable to load and play
MIDI files with a single call to a class method."

I compiled it, and it works.
I can imagine a new generation of NINJAM server/client with new usages/goals, like MIDI sync, MIDI tracks, etc.
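As a rough illustration of what the channel-10 tick amounts to, here is a minimal sketch (assuming PortMidi, the library I mention further down; note numbers 81 / 42 and the 10 ms gate are only example choices, not the exact values used in my build):

#include "portmidi.h"
#include "porttime.h"

/* sketch: one metronome tick as a GM percussion note on MIDI channel 10
   (status 0x99 = Note On on channel index 9). Note 81 = open triangle for
   the accented beat, note 42 = closed hi-hat for the others. Assumes the
   output stream was opened with a small latency so timestamps are honored. */
void metronome_tick(PortMidiStream *out, int accent)
{
    int note = accent ? 81 : 42;
    PmTimestamp now = Pt_Time();
    Pm_WriteShort(out, now, Pm_Message(0x99, note, 100));    /* note on, velocity 100 */
    Pm_WriteShort(out, now + 10, Pm_Message(0x89, note, 0)); /* note off ~10 ms later */
}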


No, the NINJAM protocol does not provide for externalising sync directly, as far as I could make out. A NINJAM client might do something clever with this, but the one in Reaper doesn't have that feature (I don't think it's possible to have a VST plug-in supplying sync -- the host is in charge).


I have a digital mixer I run into a laptop via USB. I also connect my iPad running Drumjam to another USB port on my laptop. I use Music IO to connect the iPad/Drumjam to Reaper/NINJAM. That seems to work, except for syncing. Is there a way to make the app sync to the metronome of NINJAM?


Hi DoubleBass!
^^

Thank you for this warm and enthusiastic message, much like the music you produce.
So I have surveyed the possibilities, testing the different options I'm considering for the NINJAM standalone.

Now it just remains to do it...
AndyMC also announced that someone else is working on a VST version, which bodes very well for the future of NINJAM.

At the moment I'm drawing inspiration from music of Malian origin, from which I grant myself little patches of dream and travel.
Hard for me to resist this: http://www.youtube.com/watch?v=jkSsN4dFj2E
The blues is the root of Johnny Hallyday's music; Malian music is at the root of mine
(and of all the music I love, ah ah ah).

When I've finished this interlude, I plan to come back to programming NINJAM.

See you soon!
JAAAAAAZZZZ !!
^^


Hi Ezee

On top of bringing a super enthusiastic spirit whenever you're on the server, you're getting your hands dirty to improve it. Thanks for all the time you spend on all this coding; I wish you success and I'm cheering you on too... you're the best!! Come on Ezee!!

We're eagerly awaiting any improvement to this wonderful toy that is NINJAM.

Bravo!! Bravissimo!!! Great job!


Hurrah! ^^

I had a bad frequency setting in Jack: 48000 Hz, while Cubase was set to 44000.
So now NINJAM standalone + Cubase + Jack run smoothly and seem to be a good solution for linking live ASIO data.

I will now try to create multiple in/out Jack ports, to test multi-channel routing between NINJAM and Cubase/Reaper.

That is very good news for me, since I now know that my previous work on MIDI sync will end up with audio sync too! (for the old standalone, I mean)

Yeaaaah.
^^

(I'm having a big brainstorm about the best road map, considering the capabilities of the standalone version, the new dependencies, and a hypothetical VST version where audio and MIDI sync are established by the host.)


eh eh, thank you guys!
That is good for my motivation level.
Cool.

So, I have installed JACK for Windows and tried to configure it quickly, but without success. I need to learn how to use that system and try it before deciding whether to integrate JACK into my workflow.

If any of you have experience with NINJAM standalone and Jack server port routing, please let me know.

Right now, Jack could be the solution to feed NINJAM standalone with the ASIO outs of any DAW.

I will start to code the MIDI sync in NINJAM this week.

Edit:
I've just used Cubase + NINJAM + Jack, but the sound is not good (like a VST under stress). I will try to change the default settings of the Jack PortAudio server to point to my M-AUDIO ASIO card. I hope I will find a way to use it, for the future of the devs.

THE FINAL USAGE COULD BE:

NINJAM STANDALONE + MIDI Yoke + JACK = any DAW in sync.


Great work!! :D


Wow Ezee !


Hi Deeds, thank you.

"Mac version?"

I don't have a Mac, so I can't compile and test my work in that environment. But the libraries I use (PortMidi for now) are compatible. So when I release the source code, I guess someone will port it to Mac.

Actually, I only have console programs that help me debug the MIDI process, and I haven't started adding that feature to the old NINJAM standalone yet.

The problem now is that even though I have created a MIDI sync to Reaper, I don't have access to individual tracks or even to ASIO audio buffers from the slave (Reaper, Cubase, etc.), as each application locks its ASIO driver.

I tried to download the ReWire SDK, which seems adequate for this project, but I would have to register a company and have a web site to receive their Source Pack... I'll put that option on hold ^^ .

The other option is Jack:

"Regardless, if you use Windows, JACK's JackRouter ASIO will recognize input from any audio program that uses ASIO — including Image-Line FL Studio and Cockos REAPER — while JACK for OS X will recognize input from audio programs that use Core Audio. In both cases you may experience higher latency than with JACK-native applications.

Read more: http://www.ehow.com/info_12195479_jack-audio-connection-kit-tutorial.htm... "

With that perspective, the standalone version of NINJAM could gain access to the features of ReaNINJAM, like routing the channels of the slave DAW.
Plus the new MIDI features that will be part of this "Ninjam Evolution" software.


Nice one Ezee.

Mac version?

X
Deeds.


Thank you! ^^

Reaper is working OK now, and I have tested the code with Cubase 5, but Cubase does not respond to MIDI Song Position messages and MIDI clock (based on tempo); it responds to MIDI Time Code (real-time frames) instead.

Jon was right with his intuition about the simplest way of doing it with Reaper: MIDI clock + Song Position messages work well, even in a loop, and are simple to code.

But Cubase responds fine to MIDI Time Code (which requires more work).
I will code both, to give the choice to the final user.


Good work, keep at it !


I got it!
Reaper is looping correctly now, slaved by my code.

I tried several times to correct my code because the loop message I sent to Reaper (with MIDI SONGPOS) seemed to cause a freeze at the first bar.

But it was in fact Reaper that was waiting 1000 ms for sync.
The solution was to change 1000 to 0 (the "synchronize by seeking ahead" option) in Reaper's "External timecode synchronization" panel (right-click the Play button to open it).

Good stuff that I could have coded into ReaNINJAM if I had the source code. For now, I will inject that functionality into the old NINJAM standalone for testing purposes.

To be continued...


Hi.
After hours of tests, I realized how tricky the sync between NINJAM and a DAW (Reaper in my tests) is.

I am able to launch Reaper from code with 2 protocols:
_ MIDI Beat Clock (tempo dependent)
_ MIDI Time Code (frame dependent)
The tempo of Reaper must be set beforehand to the corresponding value in NINJAM.

That seems perfect, but... what about looping mode?
Aie aie aie...

Much more complicated, and I'm working on that right now.
The key is to mix protocols:

(from Pro Tools engineers)

The exact MIDI event sequence at the loop point is as follows:
"Stop"
"Song Position Pointer" (pointing to the loop start location)
"Continue"

So I will have to mix Time and Song Position MIDI messages in order to make the slaved DAW loop properly.
Basically, in a 16-bpi session, my program will set up the MIDI loop messages to that value and send them to the DAW; see the sketch below.
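To make that recipe concrete, here is a rough sketch of the loop-point sequence with PortMidi (the function name and variables are mine; the Song Position Pointer counts MIDI beats, i.e. sixteenth notes, so a 16-beat interval in 4/4 is 64 of them):

/* sketch: Stop -> Song Position Pointer -> Continue at the loop point */
void send_loop_point(PortMidiStream *midi, int loop_start_16ths)
{
    /* 1. stop the slave transport */
    Pm_WriteShort(midi, 0, Pm_Message(0xFC, 0, 0));           /* MIDI Stop     */

    /* 2. point it back to the loop start; SPP carries a 14-bit count of
          sixteenth notes split into two 7-bit data bytes                 */
    Pm_WriteShort(midi, 0, Pm_Message(0xF2,
                                      loop_start_16ths & 0x7F,
                                      (loop_start_16ths >> 7) & 0x7F));

    /* 3. resume from that position */
    Pm_WriteShort(midi, 0, Pm_Message(0xFB, 0, 0));           /* MIDI Continue */
}

For a loop back to the very start of the session, loop_start_16ths is simply 0.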

My metronome will benefit from that research (the beats were updated by the NINJAM client's BPI refresh; I want to be more precise and work with timestamped messages from the server).

Voilà!
^^


Hi.
I've just found working C++ (C, in fact) code to launch Reaper from (the PortMidi lib -> midiclock sample).
Nice, because I will soon be able to implement this function in the NINJAM client (to synchronize the start of the sequencer with NINJAM).

Good jams to all.
Hey, I've never tested the code feature here...
This is the piece of code I will use:

/* timer_poll -- the timer callback function */
/*
 * All MIDI sends take place here
 */
void timer_poll(PtTimestamp timestamp, void *userData)
{
    static int callback_owns_portmidi = false;
    static PmTimestamp clock_start_time = 0;
    static double next_clock_time = 0;
    /* SMPTE time */
    static int frames = 0;
    static int seconds = 0;
    static int minutes = 0;
    static int hours = 0;
    static int mtc_count = 0; /* where are we in quarter frame sequence? */
    static int smpte_start_time = 0;
    static double next_smpte_time = 0;
#define QUARTER_FRAME_PERIOD (1.0 / 120.0) /* 30fps, 1/4 frame */

    if (callback_owns_portmidi && !active) {
        /* main is requesting (by setting active to false) that we shut down */
        callback_owns_portmidi = false;
        return;
    }
    if (!active) return; /* main still getting ready or it's closing down */
    callback_owns_portmidi = true; /* main is ready, we have portmidi */
    if (send_start_stop) {
        if (clock_running) {
            Pm_WriteShort(midi, 0, MIDI_STOP);
        } else {
            Pm_WriteShort(midi, 0, MIDI_START);
            clock_start_time = timestamp;
            next_clock_time = TEMPO_TO_CLOCK / tempo;
        }
        clock_running = !clock_running;
        send_start_stop = false; /* until main sets it again */
        /* note that there's a slight race condition here: main could
           set send_start_stop asynchronously, but we assume user is
           typing slower than the clock rate */
    }
    if (clock_running) {
        if ((timestamp - clock_start_time) > next_clock_time) {
            Pm_WriteShort(midi, 0, MIDI_TIME_CLOCK);
            next_clock_time += TEMPO_TO_CLOCK / tempo;
        }
    }
    if (time_code_running) {
        int data = 0; /* initialization avoids compiler warning */
        if ((timestamp - smpte_start_time) < next_smpte_time)
            return;
        switch (mtc_count) {
        case 0: /* frames low nibble */
            data = frames;
            break;
        case 1: /* frames high nibble */
            data = frames >> 4;
            break;
        case 2: /* seconds low nibble */
            data = seconds;
            break;
        case 3: /* seconds high nibble */
            data = seconds >> 4;
            break;
        case 4: /* minutes low nibble */
            data = minutes;
            break;
        case 5: /* minutes high nibble */
            data = minutes >> 4;
            break;
        case 6: /* hours low nibble */
            data = hours;
            break;
        case 7: /* hours high nibble */
            data = hours >> 4;
            break;
        }
        data &= 0xF; /* take only 4 bits */
        Pm_WriteShort(midi, 0,
                      Pm_Message(MIDI_Q_FRAME, (mtc_count << 4) + data, 0));
        mtc_count = (mtc_count + 1) & 7; /* wrap around */
        if (mtc_count == 0) { /* update time by two frames */
            frames += 2;
            if (frames >= 30) {
                frames = 0;
                seconds++;
                if (seconds >= 60) {
                    seconds = 0;
                    minutes++;
                    if (minutes >= 60) {
                        minutes = 0;
                        hours++;
                        /* just let hours wrap if it gets that far */
                    }
                }
            }
        }
        next_smpte_time += QUARTER_FRAME_PERIOD;
    } else { /* time_code_running is false */
        smpte_start_time = timestamp;
        /* so that when it finally starts, we'll be in sync */
    }
}


I have downloaded the Win32 device driver DevKit.
Later I will try to make my own virtual MIDI port service.

But for now I am deep in the code of the NINJAM MIDI implementation, beginning with the MIDI Time Code to launch an external DAW and ending with a MIDI sequencer embedded in the client (its first job will be to capture BPI bars to create MIDI loops from live playing).

The ninjamEvo server will be able to generate MTC as master and synchronize the launch of a slave DAW (Reaper, Cubase, etc. ... anything that supports MTC sync).

My source (a very good one) is:

Maximum MIDI
Music Applications in C++
http://www.manning.com/messick/


I was talking about last-minute surprises...

MIDI ROUTING CONSTRAINTS:

It's a fact: I can't open a MIDI in/out port from my application if it's already opened in an external DAW.
The solution is:

A_ to use virtual MIDI ports to route pseudo device ins/outs to/from an external application (NINJAM standalone + DAW)

B_ to be a VST and inherit the MIDI ports of the host (NINJAM VST + DAW)

Existing A solutions are:

MIDI Yoke, Maple, LoopBe1, Sony Virtual MIDI Router, loopMIDI.

I already know MIDI Yoke; I will also test the Sony Virtual MIDI Router, which seems to be free.

Solution B is up to me... ah ah ah
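On the application side, picking up a virtual port like MIDI Yoke is just a matter of enumerating the devices and opening the one whose name matches. A minimal sketch with PortMidi (the "Yoke" substring and the helper name are only an example; the exact device name depends on the driver installed):

#include <string.h>
#include "portmidi.h"

/* sketch: open the first MIDI output whose name contains `wanted`
   (e.g. "Yoke" for MIDI Yoke, or "loopMIDI") */
PortMidiStream *open_output_by_name(const char *wanted)
{
    PortMidiStream *out = NULL;
    int i;
    Pm_Initialize();
    for (i = 0; i < Pm_CountDevices(); i++) {
        const PmDeviceInfo *info = Pm_GetDeviceInfo(i);
        if (info->output && strstr(info->name, wanted) != NULL) {
            /* latency 1 ms so that timestamps passed to Pm_WriteShort are honored */
            if (Pm_OpenOutput(&out, i, NULL, 256, NULL, NULL, 1) == pmNoError)
                return out;
        }
    }
    return NULL; /* not found, or the port is already opened elsewhere */
}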

Edit
***********************************************

YEAH! After installing MIDI Yoke, the test went great!

Step 1: Choose the MIDI Yoke out port in ninjamEvo
Step 2: Choose the MIDI Yoke in port in Reaper
Step 3: Press "TEST"

KABOOM! You hear the metronome with the sound of an e-drum or VSTi. In my case, I heard the bass drum selected in NINJAM played with the "Tama Starclassic" sound of Addictive Drums.

Very cool!

Now I plan to make the server able to send a drum MIDI file to the clients that ask for it. As ENet allows creating several channels inside one UDP connection, I could use one channel to send the MIDI drum file and the other(s) for client MIDI performance and MTC messages (see the sketch below).

One example of application:

E-drummers could record their own bars in a MIDI file and send them to the server, which will be able to broadcast and loop the file in a NINJAM session.
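A rough sketch of the two-channel idea with ENet (the address, port and payload bytes below are placeholders, not the real ninjamEvo protocol):

#include <enet/enet.h>

int main(void)
{
    unsigned char drum_file[] = { 0x4D, 0x54, 0x68, 0x64 };  /* placeholder: "MThd" header bytes */
    unsigned char sync_msg[]  = { 0xF2, 0x00, 0x00 };        /* placeholder sync message         */

    if (enet_initialize() != 0) return 1;

    /* client host: 1 outgoing peer, 2 channels, no bandwidth limits */
    ENetHost *client = enet_host_create(NULL, 1, 2, 0, 0);

    ENetAddress address;
    enet_address_set_host(&address, "localhost");   /* placeholder server address */
    address.port = 2050;                            /* placeholder port           */

    ENetPeer *server = enet_host_connect(client, &address, 2, 0);

    ENetEvent event;
    if (server && enet_host_service(client, &event, 5000) > 0 &&
        event.type == ENET_EVENT_TYPE_CONNECT) {
        /* channel 0: the drum MIDI file, reliable and ordered */
        enet_peer_send(server, 0,
            enet_packet_create(drum_file, sizeof drum_file, ENET_PACKET_FLAG_RELIABLE));

        /* channel 1: performance / sync messages */
        enet_peer_send(server, 1,
            enet_packet_create(sync_msg, sizeof sync_msg, ENET_PACKET_FLAG_RELIABLE));

        enet_host_flush(client);
    }

    enet_host_destroy(client);
    enet_deinitialize();
    return 0;
}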

Edit 2
***********************************************

Downloaded MIDI-OX, a tool that allows sending/inspecting all kinds of MIDI messages.

SUCCESS RUNNING REAPER FROM SOFTWARE-GENERATED MTC.

That is very good news for me: it will allow starting Reaper's sequencer from the ninjamEvo standalone (if you use a MIDI router like MIDI Yoke).
It should work with other DAWs that can be synced by MTC.

" got it. I think it will

" got it. I think it will work fine that way. "

I hope so . But there are always last time surprises ...

" I was just making things too complicated "

Not at all .
The background of midi programming is complicated , you
just try like i do to understand how it works and which way
to choose . It's cool to have your opinion .

" thx. "

You are welcome !


got it. I think it will work fine that way. I was just making things too complicated :DDD

thx.

" What do you think? " I

" What do you think? "

I think you are of good help .
;)


" So in other words, MTC was designed with one MASTER and many SLAVE devices in mind. "

Sure .
My plan is to use MTC as a protocol for sync the softwares with potential DAW ( each local to a client )
or hardware ( your idea of testing hardware sync is good man !) .

Look that design of multi slaves :

Take the case where you join a session and want to play
a drum sequence in reaper for exemple .
The server will send you a MTC at the connection time , after the acceptance of the license .

The MTC message will contain the time where to start the sequencer for the caller ( unique to each ninjam client instance)
So each ninjam client could have his own DAW slaved ,
starting his loop at the first beat of audio record in ninjam .
After the first wait interval ( you receive audio data from server+MTC for start the sequencer ), your sequencer should start syncronised .

THAT IS FOR SYNC START THE SEQUENCERS ONLY
( i will explore the other caps later )
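To illustrate the "tell the slave where to start" part: the usual one-shot message for locating a slave to an absolute HH:MM:SS:FF position is the MTC Full Frame SysEx; the regular quarter-frame stream then keeps it rolling. A rough sketch with PortMidi, assuming an already opened output stream like the one in the midiclock sample quoted earlier:

/* sketch: locate an MTC slave to hh:mm:ss:ff with a Full Frame message */
void send_mtc_full_frame(PortMidiStream *midi,
                         int hours, int minutes, int seconds, int frames)
{
    unsigned char msg[10];
    msg[0] = 0xF0;                                         /* SysEx start               */
    msg[1] = 0x7F;                                         /* universal real-time       */
    msg[2] = 0x7F;                                         /* device id: broadcast      */
    msg[3] = 0x01;                                         /* sub-id #1: MIDI Time Code */
    msg[4] = 0x01;                                         /* sub-id #2: Full Message   */
    msg[5] = (unsigned char)((0x3 << 5) | (hours & 0x1F)); /* 30 fps rate bits + hours  */
    msg[6] = (unsigned char)(minutes & 0x3F);
    msg[7] = (unsigned char)(seconds & 0x3F);
    msg[8] = (unsigned char)(frames & 0x1F);
    msg[9] = 0xF7;                                         /* SysEx end                 */
    Pm_WriteSysEx(midi, 0, msg);
}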

------------------------------------------------------------------

" Well, the question now is obvious. Reaper don't sync to midi beat clock, but to MTC...:(((( "

I think that we are talking about protocols , that all have
specific regions of interest .

I am ok for implement any kind of protocols , as plugin
for example . I'm working on that , to give the choice of
setup . ( Like in 3D domain you may choose DirectX or OpenGl ) .

from http://tweakheadz.com/sync-mmc-mtc-smpte/ :


"Which to use? The one that works. Not all of them will. But generally, syncing sequencers and multi-track recorders you use MTC. Controlling synths you use MIDI clocks. Controlling other sequencers you use whichever works. Many drum machines, for example, won't accept SMPTE. Many multi track recorders will not accept MIDI clocks. Some devices won't accept MMC at all. Others will allow themselves to be an MMC master but not a slave. Usually, there is one combination that will work, but don't bank on it. Some devices do not work at all. Always check that out before you by gear that needs to synchronize."

The server could answer to a client witch is the desired protocol/msg type , that is " easy " to implement .

But hey , i like also to keep the things simpler ...
Thank you for that idea Jon !


sounds cool.

I have an observation that may be worth discussing.

According to Wikipedia, the MTC specification uses SMPTE as its timecode:

http://en.wikipedia.org/wiki/MIDI_timecode
http://en.wikipedia.org/wiki/SMPTE_time_code

"MTC allows the synchronisation of a sequencer or DAW with other devices that can synchronise to MTC or for these devices to 'slave' to a tape machine that is striped with SMPTE"

So in other words, MTC was designed with one MASTER and many SLAVE devices in mind. Ninjam is more decentralized and I don't see how you will fit a master/slave architecture into a p2p type design. Let me give you an example:

Once a jam has started, the time will be fixed for all the clients in a linear format like this:

HH:MM:SS:FF

So, for example, I join a server. The SMPTE will start at 00:00:00:00 and count up...

Then let's say you join at 00:00:30:15. In that case, your client will have to read 00:00:30:15. If I leave and someone else joins at, say, 00:00:46:20, then their client will read that, and so on... I see a potential compatibility problem with this for old NINJAM or Reaper clients.

But I can be wrong. Dunno :/

So maybe you need something that is not linear and much more basic. Check out MIDI clock.

http://en.wikipedia.org/wiki/MIDI_beat_clock

"Unlike MIDI timecode, the MIDI beat clock is tempo-dependent. Clock events are sent at a rate of 24 ppqn (pulses per quarter note). Those pulses are used to maintain a synchronized tempo for synthesizers that have BPM-dependent voices and also for arpeggiator synchronization. It does not transmit any location information (bar number or time code) and so must be used in conjunction with a positional reference (such as timecode) for complete sync."

I think it fits the NINJAM architecture better.

I think once you join a server the only information that is shared is BPM, BPI and some kind of song pointer information. This is very similar to MIDI beat clock, but MTC does not use this kind of information (I think).

Well, the question now is obvious. Reaper doesn't sync to MIDI beat clock, but to MTC... :((((

Hope I can help you find the right path in order to keep compatibility. Maybe the MTC is the long term solution. I just saw this could be a problem.

What do you think?


Great!
Deal!
Yes, we're going to do that.

I have started to code a client/server with ENet (UDP) that will mimic the NINJAM TCP protocol but with MIDI messages.
The client will ask the server for a connection, and the server will reply with an MTC message for the start of the song.

The client code is currently able to open MIDI ports, capture live playing, and put the messages in a queue for local play and for feeding the server.

The server code is only running ENet for now; I will code it tonight.

In the end, the client will be a plugin (DLL) for NINJAM, and the NINJAM server will be updated with UDP capabilities.

Good to know your gear, Jon; I have an old Roland too, which I will use for my tests.

EDIT: I've found a very good article on the subject "A cross-platform plugin framework for C/C++":
http://www.drdobbs.com/cpp/building-your-own-plugin-framework-part/20420...

I'm trying to create a NINJAM SDK with a new plugin architecture, with a framework that creates a default NINJAM project.
That could help future devs.


If you need some help with the beta testing, I have some external gear that can sync to MTC (i.e. a Korg D3200), and it is rock solid when used with Reaper and Cubase, so it may be a good feedback source going outside the software dimension through real MIDI cables.

Let me know.

;)


Thank you, Jon!

Today I've run into some limitations and constraints with the NANA GUI lib, and I'm reorienting toward wxWidgets (http://sourceforge.net/projects/wxwindows/?source=dlp), which also has native HTML support (to follow the idea of a mini browser in NINJAM).

Edit: I spent hours researching how to synchronize a DAW (Reaper, Cubase) with NINJAM. That is a complex challenge.

Reaper, for example, can be slaved with a lot of different parameters (right-click the Play button to see the options). MTC (MIDI Time Code) is probably the form I will use:


"The final frontier: sample-accurate timecode
So, with a proper word clock setup in an all-digital system, it's easy to achieve perfect, sample-accurate continuous synchronization. The accuracy of the start point, on the other hand, depends upon the type of timecode being used.
MTC, for instance, offers resolution of 1/4 frame at 30 frames per second, or about 1/120 of a second - about 400 samples at 48kHz. SMPTE may offer greater resolution in some cases. However, if you're using a digital audio program, chances are that SMPTE must be converted to MTC before reaching the software - which means that you're still operating at MTC resolution."

src: http://www.danphillips.com/articles/features/get_in_sync.htm


cool stuff ezee. And thank you for spending your time upgrading the client.


Thank you AndyMc.

Yeah, programming the MIDI interface is very exciting.
I've found a lot of fantastic libs from universities and engineers; they are old (from the very beginning of MIDI) and a little deprecated (because of the old MIDI format).

But I guess that once I can handle type 0 and type 1 MIDI files, a lot of cool stuff could be done.

Like, for example, having the score of the drummer or pianist at the end of the session. Cool, no?

It's much easier to analyse packets of MIDI notes than audio buffers, as we know exactly which notes were played and their duration.
So when 3 notes are played as a chord, my job could be to find the ROOT, 3rd, 5th and dispatch and display the info to all clients. Something like the sketch below.
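To make the ROOT/3rd/5th idea a bit more concrete, here is a naive sketch (a throwaway helper of mine, not NINJAM code): it reduces three MIDI note numbers to pitch classes and looks for the note whose stacked intervals match a major (4 + 3 semitones) or minor (3 + 4) triad.

#include <stdio.h>

/* sketch: identify the root of a 3-note chord given MIDI note numbers.
   Fills `root` (pitch class 0-11) and `quality`, and returns 1 if the notes
   form a major or minor triad in any inversion, 0 otherwise. */
int find_triad_root(int n1, int n2, int n3, int *root, const char **quality)
{
    int pc[3] = { n1 % 12, n2 % 12, n3 % 12 };
    int r;
    for (r = 0; r < 3; r++) {                     /* try each note as the root */
        int a = (pc[(r + 1) % 3] - pc[r] + 12) % 12;
        int b = (pc[(r + 2) % 3] - pc[r] + 12) % 12;
        if ((a == 4 && b == 7) || (a == 7 && b == 4)) {
            *root = pc[r]; *quality = "major"; return 1;
        }
        if ((a == 3 && b == 7) || (a == 7 && b == 3)) {
            *root = pc[r]; *quality = "minor"; return 1;
        }
    }
    return 0;
}

int main(void)
{
    int root; const char *quality;
    if (find_triad_root(60, 64, 67, &root, &quality))   /* C E G */
        printf("root pitch class %d, %s triad\n", root, quality);
    return 0;
}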

Your idea of flashing chords linked to the metronome is a great one, Andy. If I make it, you will be credited for the idea. ^^

OK, I'm going back to my tests (currently on the new GUI with NANA). Thank you all!

Edit: For SysEx messages, the architecture of NINJAM with the BPI delay before connection is a very good thing:

"... Sequencer users sometimes reserve the first bar of a song for transferring sysex messages (such as patch loads), actually starting the song at the beginning of the second bar." src: http://electronicmusic.wikia.com/wiki/Sysex


Yes, we don't want someone telling you not to; you're doing a fine job.
Yes, I'd suggest what you say: a new message pump, something extra that will be ignored by older NINJAM versions but seen by the newer version of the server you adjust.
This way there should be no compatibility problems; people can use the old version or your version.

We've asked and asked and asked for years, and most times the answer I have received has been that the open source is there for anyone who wants to do it, and you are, so keep doing it. :)

" et si tu inserais un midi

" et si tu inserais un midi time code dans reaper pour les " VSTi drums " afin qu ils soit synchro sans avoir a batailler ; se serait bien . "

J'aimerai bien aussi .
L'insérer sera difficile , l'envoyer plus probable .
J'ai commencé à étudier la faisabilité , je rassemble de la doc etc ...

A+

edith :

" se doit etre possible a faire , bon courrage !"

Oui , possible avec sysex ( systeme exclusif ) :
http://electronicmusic.wikia.com/wiki/MIDI_machine_control
Merci !


Phil D = Cry & cry = phil D

Hi,
if you could insert a MIDI time code into Reaper for the "VSTi drums" so that they stay in sync without having to fight for it, that would be nice. That is: when I hit Start on my drums, they lock straight onto the BPM of ReaNINJAM.
And also, when the BPM is changed on the server, it would be good if Reaper updated its BPM to the same tempo. That would be a good thing; I don't know how to knit that together, but it must be possible to do. Good luck!

" Bonne chance dans votre

" Bonne chance dans votre entreprise. : ) "

Thank you !

I like the idea of ninjam , and to program in that context is
pleasant . I will learn a lot of techniks too , like core midi ,network protocols ( Ninjam use TCP for audio and chat, UDP could be my initiative for MIDI channels ... http://enet.bespin.org/ ).

I've started a private project in gitLab , that i will use for my first experimentations ( and fails *... ah ah ah ) .
When ready , i will create a public project , with sources and binary downloads .

*
I have made 2 tests in different PC today, the midi part was ok but the audio had problems ( but work fine in my pc ). Perhaps a bad compilation of " njasiodrv " , that is linked with the project .
The two Pcs had Asio4All .

Here is that first debug version for windows 32bits :
( compiled/linked in debug mode with visualStudio2008 on winXP SP2 )
https://www.dropbox.com/s/tc7jujsaphonwdi/ninjamEvo-debugV01.rar

Try it if you want a preview , and your reports could help me also .
Thank you for your comment !


They stopped updating NINJAM back in the now-distant 2005. It's time for someone to take the next step.
The only input that can be used is the mic/line in. So, Ezee, it's up to you to solve the problems.
Good luck in your endeavour. : )

"Never give up" - Buddy Guy : )


Hi!
Good questions.

------------------------------------------------------------------
"Question 1: Would you program this if it was included in Reaper?"

Yes, possibly.
That MIDI metronome is very easy to implement; I don't understand why the Cockos programmers haven't made it already (some people have asked for it for years...).

Perhaps because NINJAM was designed from the beginning for real instruments and audio streaming (the design of the core NINJAM process is still revolutionary).

The core of NINJAM (network + time-related musical intervals) is very specific and unique. I won't touch that part
(except for the problem reported by AndyMC with the wrong number of stereo channels; I will check that too).

What I want to do is plug in a new graphical interface that will deal with the NINJAM core, including my new functionalities. That should preserve the old NINJAM native code, which runs well.

Lastly, if I decide to send MIDI/SysEx messages between CLIENT and SERVER, I will have to make some choices (tweak the existing message pump or create a new one).
But the key is to stay compatible (client side) with all kinds of servers.

My dream is to achieve a standalone host for VSTs, similar to a DAW (but without editing audio or MIDI).

The design I can imagine is founded on this:

_ The client queries the server, and its functionality adapts.
If the server is MIDI COMPATIBLE, the MIDI menu will be accessible.
If not, the menu will be greyed out.
Etc... Etc...

So the current admins of traditional servers will not lose their clients because of the evolution.

----------------------------------------------------------------

"Question 2: Should we ask Cockos to program this as a closed-source plugin?"

From my side, I am waiting for light/medium-scale reports from beta testers before communicating on the Cockos forums.

The first release of "Ninjam Evolution" will be for Win 32-bit first, then Win 64-bit (or at the same time, depending on how easy the porting is).

I can't program for OSes other than Windows, but I will try to use libs that are compatible with other systems. I guess that if the result is good, someone else will try to make the port.

When a beta version is ready, I will open a thread here to invite ninboters to the test.
Then we'll see.


It is cool Ezee!
But it takes a lot of time and hard work.
Therefore:
Question 1: Would you program this if it was included in Reaper?
Question 2: Should we ask Cockos to program this as a closed-source plugin?


ah ah ah!
Thank you Theo_h!

Yeah, for playing I also use ReaNINJAM in Reaper.
But it's a closed-source VST plugin, and I can't do anything with it.

So my purpose is to bring the old NINJAM standalone to the next level:

_ Modern interface
_ MIDI in and out (with SysEx commands)
_ Metronome / beat box (powered by MIDI channel 10)
_ VST host (the ultimate goal)
_ Web interface (mini internal browser)
_ Sequencer / post-production (work offline with the material recorded in the previous session)

Thank you for your interest in my cheese cave!
^^


Nice idea!
Though I don't use the standalone version,
I find the metronome click uncomfortable.
Good luck, and it smells like good cheese ha-ha

"I was going to suggest

"I was going to suggest further expanding on the gm midi ch 10 with idea of playing drums via a midi file."

That sound great .
I will dig the idea .

And , i had some meditation about Metronome as local Drum :

_ A programmable beat box inside ( you can give to each beat and/or afterbeat a sound ) .
Predefined paterns also ( allow different styles in a list ).
Routing to a real machine ?
Load Midi drum files ?

_ Same drum program for all via the " !voting drumstyle" command chat send to server ?

I've finished to code the MidiOptions , the configuration
is saved to disk and is loaded automatically when the
MidiOption menu is clicked .
(this check will probably be mooved on application launch , some little tweaks again ... ).

Edith : In // , i'm still searching a good sdk for the new UI . I have found " NANA " , that is looking promising :
" 1. Introduce to Nana
1.1 Nana is a C++ framework provides GUI, threads and filesystem. It provides many of the basic classes and functions for the cross-platform programming. "
screenshot:
http://nanapro.sourceforge.net/help/gui/images/effects_bground_show.png

NANA site : http://nanapro.sourceforge.net/help/index.htm


That's a good plan. I was going to suggest further expanding on the GM MIDI ch 10 with the idea of playing drums via a MIDI file.
This lib would possibly allow for this. I think Tom experimented with MIDI drums on ninbot.
I don't think other MIDI parts should be included, but drums for most jams without a drummer would be a good addition.


ah ah ah!
Yeah man.
(I can't invest money, so I invest time...)

This little project is fun; I've just heard the open triangle sound as the metronome beat, which is sweet for the ears with headphones.
Hey!! I forgot to code the volume of the MIDI channel.
One more job on my to-do list...

I am finalizing the midiconfig.cfg procedure.
To be quick, I just save a raw struct in binary format (see the sketch below).
Yeah, that won't be human readable (it's a lot of work to write a parser, and I don't want to use NINJAM's WDL in this case; I want to break free! ah ah ah).
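Something like this is what I mean by a raw binary struct (the field names are only an example of what my midiconfig.cfg could contain):

#include <stdio.h>

/* sketch: example fields for a midiconfig.cfg saved as a raw struct */
typedef struct {
    int out_device;   /* MIDI output device id            */
    int channel;      /* 9 = MIDI channel 10 (percussion) */
    int beat_note;    /* GM drum note for normal beats    */
    int accent_note;  /* GM drum note for the first beat  */
    int velocity;     /* tick volume                      */
} MidiConfig;

int save_config(const MidiConfig *cfg)
{
    FILE *f = fopen("midiconfig.cfg", "wb");
    if (!f) return 0;
    fwrite(cfg, sizeof *cfg, 1, f);   /* raw dump: fast, but not human readable */
    fclose(f);
    return 1;
}

int load_config(MidiConfig *cfg)
{
    FILE *f = fopen("midiconfig.cfg", "rb");
    if (!f) return 0;
    int ok = (fread(cfg, sizeof *cfg, 1, f) == 1);
    fclose(f);
    return ok;
}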

Soon come.


Way to take the initiative man!


Thank you Unfretted!

The release of that MetroMidi version shouldn't take too long.
The program actually works, but it doesn't remember the MIDI config after closing.

So basically, I just need to save the config to a file.
I could use "ninjam.ini", but my final choice is to keep my additions separate and leave the old NINJAM config "natural".

Ah, and I must create something to replace the old BPI bar.
The NINJAM interface uses GDI calls to refresh its elements, and that is slow for a real-time application (but usual for a desktop program).

OpenGL or DirectDraw could be a way to use the graphics card to render nice animations at high speed.
For a modern interface, I mean.

Some people report problems with the BPI bar when using Win7 or other 64-bit OSes.

The reason could be:
"
Windows 7 includes GDI hardware acceleration for blitting operations. This improves GDI performance using new features in the Windows Display Driver Model v1.1. This allows the DWM engine to use local video memory for compositing, thereby reducing system memory footprint and increasing the performance of graphics operations. Most primitive GDI operations are still not hardware-accelerated, unlike Direct2D. As of November 2009, both ATI and Nvidia have released WDDM v1.1 compatible video drivers. "

source: GDI: http://en.wikipedia.org/wiki/Graphics_Device_Interface

So check your video drivers, guys?


Great ! Hope it works out ok.


Sure!
I am in debug mode at the moment, and for Win32 only.
I have to finish saving the MidiOptions to a file, and a few little things.

I will open a site dedicated to this production, which will most likely be called "Ninjam Evolution".

I will also need beta testers before doing any public release.

Thank you for your comment!
(The idea started here: http://ninbot.com/topic/alternative-forms-metronome-feedback#comment-682... )


This is great! Will it be available for download?


