
Creative coding

About

My research imperatives run along the threads of trans-mediation, inter-semiotic translation, post-disciplinarity (interdisciplinarity, multidisciplinarity) and the general concerns of Human-Computer Interaction (HCI).

A large part of my contemporary practice in creating audiovisual materials happens through creative coding. Coding offers a wealth of expressions and functionalities, and it shares many similarities with creative writing in the way the writer-scripter organises text and ideas; in this sense, coding helps to exercise the imagination. In my view, the process of transmediation from text to audiovisuality makes it possible to explore other disciplines, such as science and linguistics, and the way they are combined unveils the artistic and individual cognitive style of each person.

Key words: big-data, connectivity, interdisciplinarity, interconnection, machine learning, multidisciplinarity, real-time, sonification, streaming, visualisation, web

Interdisciplinarity

Science and art

Biology and sound/music-image

Data-driven sonification and visualisation.

"L-systems, also known as Lindenmayer systems, are a class of algorithms for producing structures by means of recursive rewriting rules. 'L-systems are widely used in computer graphics, mainly for producing complex plant forms and for imitating other natural forms' (Prusinkiewicz and Lindenmayer 1990; Rozenberg 1992). The technique of L-systems is akin to other techniques such as fractals and generative grammars. L-systems have also been used, to a limited extent, for producing musical structures. The thesis attempted the direct application of L-systems to the digital composition of the sound signal itself. In this case, L-systems were used to develop new methods of granular synthesis which combine features of the micromontage technique. In-depth research was carried out on Phasor, Patterns (Dseq, Pbind), GrainBuf and wavelet-transform techniques. This work was implemented in the SuperCollider software." (from Agiomyrgianakis, V., undergraduate thesis, 2012, "ΜΙΑ ΤΕΧΝΙΚΗ ΣΥΝΘΕΣΗΣ ΗΧΟΥ ΒΑΣΙΣΜΕΝΗ ΣΤΑ L-SYSTEMS" ["A Sound Synthesis Technique Based on L-Systems"]).

This is the L-system in SuperCollider with the axiom and the production rules

The code on the right panel shows the way I mapped numbers to the symbols of an L-system.

L-systems SuperCollider

a = LSys("F", ["F" ->"F+F--/]&F+F", "F" -> "]/F+&FF+-"]);

Mapping: associations (which relate two objects) can be created via the -> operator

~ls = [
           'F' -> 1,
           '+' -> 2,
           '-' -> 3,
           '/' -> 5,
           ']' -> 8,
           '&' -> 13
           ];
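To make the rewriting-plus-mapping idea concrete outside SuperCollider, here is a minimal Python sketch; the production rule and the numeric mapping below are illustrative, not the exact ones used above.

```python
# Minimal L-system sketch: expand an axiom by recursive rewriting rules,
# then map each symbol to a number, as in the ~ls association list above.
def expand(axiom, rules, generations):
    s = axiom
    for _ in range(generations):
        # rewrite every symbol; symbols without a rule are copied unchanged
        s = "".join(rules.get(c, c) for c in s)
    return s

rules = {"F": "F+F--F"}  # a single illustrative production rule
mapping = {"F": 1, "+": 2, "-": 3, "/": 5, "]": 8, "&": 13}

string = expand("F", rules, 2)
numbers = [mapping[c] for c in string]
print(string)    # F+F--F+F+F--F--F+F--F
print(numbers)
```

The resulting number list can then drive synthesis parameters, in the same way that ~ls feeds the pattern-based playback.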

Finally, I have used patterns in SuperCollider to manipulate the arguments of the synth.

Pbind Pattern

 Pbind(\instrument, \synth01,
 \volume, Pseq(~ls.collect(_.value) * 0.05),
 \frequency, Pseq(~ls.collect(_.value) * 440)).play;

Astronomy and sound/music-image

Magnetic Storm Sonification

10th Panhellenic Conference of Amateur Astronomy

Real-time sound synthesis from data describing magnetic storms.

V. Agiomyrgianakis1, F. A. Metallinou2, I. Zannos1

  1. Department of Audio and Visual Arts, Ionian University.
  2. Institute for Astronomy, Astrophysics, Space Applications and Remote Sensing, National Observatory of Athens.

Abstract

Converting scientific data into sound, and using sound both to understand natural phenomena and to teach the natural sciences, is a contemporary, cross-disciplinary method, and a very useful tool for deriving and publishing original results. In this work we sonify data describing variations of the geomagnetic field, and in particular data describing the development and evolution of magnetic storms in near-Earth space. In space physics, magnetic storms are recorded by ground-based instruments, magnetometers, which measure variations of the geomagnetic field. We use data from the magnetometers of the National Observatory of Athens, collected, managed and processed by the ENIGMA research project (Hellenic Geomagnetic Array). Magnetic storms are phenomena directly related to solar activity. During a magnetic storm, energy carried by the solar wind enters the Earth's magnetic field and causes a multitude of phenomena which can disrupt the operation of satellites, the work of astronauts in space, and other human activities on the ground. The aurora is the manifestation of magnetic storms in the optical window of the electromagnetic spectrum. The idea of sonifying magnetospheric phenomena is inevitably linked to Greek thought of the 6th century BC, when the Pythagoreans, philosophers, mathematicians and music theorists, sought to combine astronomy with music through the notion of the "harmony of the spheres". They used sound as a means of understanding and studying natural phenomena; a characteristic example is Archytas, who associated the revolution of the celestial bodies with frequencies, treating the harmony of the spheres as a problem of physics.
Plato, moreover, refers to music and astronomy as sister sciences (Republic, VII, 530d).

Data Processing Method

The data we process concern variations in the intensity of the geomagnetic field, as recorded by ground-based instruments, the magnetometers. We use data from two files with a total of 80,000 measurements per file. The measurements describe the intensity of the magnetic field in three dimensions (x, y, z). The successive intensity values along the three spatial dimensions (x-y-z) are projected one after another as points in a virtual three-dimensional space and connected to each other to create a shape that represents the evolution of the magnetic phenomenon in time. At the same time, the same points are used as input to a hydrodynamic flow model that is projected as a background to the image, giving an impression of the flow of the storm. For the sonification, the data serve as input parameters to digital sound-synthesis processes, designed through a series of experiments so as to convey the impression of the fluctuating intensity and direction of the magnetic field. The research aims at developing tools for projecting astrophysical data in an immersive, virtual-reality-type environment. The real-time sound-generation process is demonstrated live, with code written on the spot to explore the possibilities of the system. To sonify the data we used SuperCollider, a programming environment for sound synthesis. In SuperCollider we built an algorithm to collect the data and then translate it in order to control the parameters of our synthesizer. This paper presents the method of parameter mapping (Kramer, Gregory 1994).

The framework for mapping magnetic-storm parameters to sound is described by the following five-step procedure: (1) importing the data from the files, (2) selecting the columns to be used, (3) building the synthesizer, (4) parameterising the streams and routing them to the synthesizer's parameters, (5) simplifying the sonification process with graphical user interfaces on smart devices. In addition, we present our tools for sonification and artistic intervention in real-time performances using graphical user interfaces (GUIs). For example, we created functions and classes that encapsulate the code and, through the Open Sound Control (OSC) communication protocol, we were able to control the sonification process from smart devices. The smart-device application we used is TouchOSC.
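As a language-neutral illustration of the parameter-mapping step, the core operation reduces to a linear rescaling of each data stream into a perceptually useful range. A minimal Python sketch (the value ranges below are hypothetical, not taken from the actual magnetometer files):

```python
# Parameter-mapping sketch: scale raw magnetometer values linearly into
# an audible frequency range before routing them to a synth parameter.
def linmap(x, in_lo, in_hi, out_lo, out_hi):
    # linear interpolation: the core operation of parameter mapping
    return out_lo + (x - in_lo) * (out_hi - out_lo) / (in_hi - in_lo)

samples = [-120.0, -35.5, 0.0, 48.2, 120.0]   # hypothetical field values
freqs = [linmap(v, -120, 120, 100, 1000) for v in samples]
print(freqs)   # values between 100 Hz and 1000 Hz
```

In the actual work this mapping was done in SuperCollider, with the scaled streams routed to the synth's control inputs.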

Conclusions

This work aims to promote sonification as a method for analysing scientific data, in particular data describing the growth and decay of magnetic storms in geospace. By building an algorithm in SuperCollider we were able to sonify the data and, at the same time, to explore the aesthetic potential of data sonification for real-time performances. Through graphical user interfaces (GUIs) we enriched the real-time sonification, shaping a narrative through our own interpretation. The results of this research aim at improving existing sonification techniques, as well as developing new ones, thereby strengthening interdisciplinary collaboration in the dissemination of science. Moreover, the results can be used to promote the public understanding of space physics, and to teach it to special groups, such as people with visual impairments. Initial results have already been presented to the general public, with "Storm Trio" at the AKOYSMATA - AVARTS Festival (Athens Concert Hall, Athens, 21 May 2017) and at the "Researcher's Night" (Hellenic Cosmos, Athens, 29 September 2017), where the sound of a magnetic storm reminded us of the wind, as it exhibits variations in intensity, frequency and timbre.

Acknowledgements

We thank Dr. Georgios Balasis, researcher at the National Observatory of Athens, for providing the data which allowed us to sonify periods of intense geomagnetic activity corresponding to magnetic-storm events.

References

Hermann, Thomas, and Helge Ritter. 1999. “Listen to Your Data: Model-Based Sonification for Data Analysis.” In Advances in Intelligent Computing and Multimedia Systems. https://pub.uni-bielefeld.de/publication/2017409.

Kramer, Gregory, ed. 1994. Auditory Display: Sonification, Audification, and Auditory Interfaces. Reading, Mass.: Westview Press.

Madhyastha, Tara, and Daniel Reed. 1994. "A Framework for Sonification Design." In Auditory Display, edited by G. Kramer. Addison-Wesley.

Roads, Curtis. 2001. Microsound. Cambridge, Mass.: MIT Press.

Scaletti, Carla. 1994. "Sound Synthesis Algorithms for Auditory Data Representations." In Auditory Display, edited by G. Kramer. Addison-Wesley.

Diaz-Merced, Wanda L., Robert M. Candey, Stephen Brewster, et al. 2011. "Sonification of Astronomical Data." In New Horizons in Time-Domain Astronomy, Proceedings IAU Symposium.

// =====================================================================
// SuperCollider Workspace
// =====================================================================
// Data driven sonification of Magnetic storm

//load data files

~files = "~/Documents/data/MagneticStorm12-15\ March2016_NOA\'s\
magnetometer/*.dat.txt".pathMatch;

//:load and collect data
	"load data".postln;
	(
~load = { | path |
	var data;
	// select only these rows which contain 7 columns exactly:
	data = CSVFileReader.read(path) select: { | row, column |
		row.size == 7;
	};
// take columns 2 to 4 (x, y, z) and strip symbols such as "+" before parsing
	data.flop[2..4].flop collect: { | row |
		row collect: { | string |
			string.replace("+", "").interpret;
		}
	};
};
	)
//: Create Synths

"create and add synthdef 1".postln;

(
// first load the sound sample in the buffer

	~buffers = Buffer.read(s, "~/Documents/sounds/PianoSample01.wav".standardizePath);

// Create synthdef granulator

	SynthDef(\granulatorAn, {| gate = 1, freq = 1000, freq2
= 5000, freqblow = 10, rq = 0.25,
modfreq = 122, ind = 0.5, amp = 0.5, bufnum, envbuf, trig = 1,
dur = 0.01, rate = 1, pos = 0.3, pan = 0, vol = 0.1|

var env, modulator, source;

		modulator= SinOsc.kr(modfreq,0,10*modfreq*ind, freq);

		env = EnvGen.kr(Env.perc, gate, doneAction: 2);

source = GrainBuf.ar(2, Dust.kr(trig), dur*LFNoise1.ar(1).range(1, 3),
bufnum, BufRateScale.kr(bufnum)*(modulator/440)*rate, pos, 2,
LFNoise1.ar(pan).range(-1, 1), envbuf)*env;
source = LPF.ar(source, freq2);
Out.ar(0, (source * vol) ! 2)
}).add;



// Create synthdef klank

SynthDef(\klank01, {|out = 0, gate = 1, vol = 0.0001, freq = 440, freq3 = 999,
freq2 = 444, decay = 0.02, cutoff = 2000, amp = 0.01, trig = 1,
freqs (#[100, 200, 300, 600]),
amps (#[0.3, 0.2, 1, 0.05]),
rings (#[1, 0.1, 0.5, 2]), pan = 0|

 var env, source, filter;
         env = EnvGen.kr(Env([0, 0.8, 0], [2, 2]), gate, doneAction: 2);

source = DynKlank.ar(`[freqs*freq, amps, rings],
Dust.ar(trig)+WhiteNoise.ar(amp)+SinOsc.ar(SinOsc.ar(freq*2, freq2, freq3), 0,
0.3)*SinOsc.ar(SinOsc.ar(freq*2.43, freq2*2/35+12, 1.2.rand+[2000, 200.202]), 0,
0.3)*0.003);

	source = LPF.ar(source, cutoff, 0.4, amp).softclip;
	source =  LeakDC.ar(source, 0.995);

         Out.ar(out, Pan2.ar(source*env*vol, pan))
}).add;
	)

// load from the data files the first one
(
{
	var data, addr;
	data = ~load.(~files.first);
	// OSC target for smart-device control (declared once; not used below)
	addr = NetAddr("127.0.0.1", 12345);

	10.wait;

	"run data: storm starts".postln;

	data do: { | row |

		"TO - SYNTH".postln;

		// Parameter mapping
		~nodedkl = [
			Synth(\granulatorAn, [\bufnum, ~buffers, \trig,
				row[0].abs.postln, \dur, row[0].abs.sin.postln, \pos,
				row[0].abs.cos.postln, \rate, 1, \freq2, row[1].pow(2).postln,
				\vol, 1, \pan, row[0].abs.sin.postln, \envbuf, -1]),

			Synth(\klank01, [\freq, row[0].abs.tan.postln, \freq2,
				row[1].abs.squared.postln, \freq3, row[2].abs.squared.postln,
				\cutoff, 6000, \amp, row[1].abs.tan.postln, \legato, 1, \vol, 1])
		];

		0.1.wait; // 100 milliseconds
	}
}.fork;
)

Symbolism in creative writing

PhD thesis title: Audiovisual works in response to creative writing

I found that by using symbolism and metaphors I can tell a story in my own way. The manipulation of the semiotic process, both in image and music, is crucial for the formal structure of my works and my process of translation.

It is worth noticing how in my Haiku series there are notions of visual and audio fragmentation, generative systems, chaos and complexity, as well as other conceptual influences. I have been inspired by the Jungian approach to symbolic interpretation. For example, I have interrelated the symbolism of the Nekyia in Homer's narrative of Odysseus' descent into Hades with the descent into the unconscious as Jung analysed it in the case of the modern artist (in the collection of interviews published as C.G. Jung Speaking, 1987). In my work "Aranea" I translated an archetypal story of pride into a video game, expressing psychological complexes, such as the Icarus complex and narcissism, in relation to the concepts of hubris and nemesis as they are found in the story of Arachne.

Algorithmic composition paradigms

Frozen fragments

The procedure of precise cutting and recombining: the basis of 1950s musique concrète.

Peripatetic Haiku

Click the link on the right panel to watch Haiku series.

For the first pieces of my portfolio I decided to use Haiku poems as a source of inspiration to create audio and visuals. My first aim was to decide how to draw audiovisual material from a text source. I chose to work with Haiku poems because of their immediacy and their richness in meaning. In my opinion, Haiku are an ideal form for expressing experiences with a minimum of words by means of symbolism and metaphor. As Gibbs writes in his book The Poetics of Mind: "The empirical work in cognitive science strongly indicates that many facets of everyday thought and language are indeed metaphorical, enough so that we should recognise metaphor as a primary mode of thought." (Gibbs, 1994, p. 122)

I have used diverse methods and techniques to compose audiovisual materials. One of them is algorithmic composition and more specifically chaos and self similarity. For instance, I used L-systems to generate musical phrases and melodic structures as well as other stochastic processes and sound synthesis techniques.

L-systems, in my opinion, are poetically related to Haiku because both can yield a rich output from just a few lines.

Nekyia

Click the link on the right panel to watch Nekyia.

For this part of my work I decided to experiment with a larger poetic form, so I chose Homer's The Odyssey. The book I chose to represent audiovisually is Book XI (Rhapsody λ), and more specifically the Nekyia. This episode is characterised by Jung as a symbol of the descent into the unconscious. He used it to describe the psychological condition of the modern artist, especially the art of Pablo Picasso: When I say "he", I mean that personality in Picasso which suffers the underworld fate - the man in him who does not turn towards the day world, but is fatefully drawn into the dark; who follows not the accepted ideas of goodness and beauty, but the demonical attractions of ugliness and evil. (Jay, Jung, 2012, p. 54)

I have been inspired by Jung's thought about "katabasis and katalysis and the recognition of the bipolarity in human nature as well as the necessity of conflicting pairs of opposites" (Van den Berk 2012, p.111). In my audiovisual interpretation of Nekyia I have represented pairs of opposites such as light against dark, white against black, male against female, oar against shovel.

Text Driven Creativity

For this work, I developed an algorithm in the SuperCollider language to map text to sound. In particular, I experimented with the idea of using the letters of the Nekyia chapter of Homer's Odyssey. I mapped the text to numeric values, which could then easily be used to manipulate sound parameters such as frequency, amplitude, timbre and duration. In my effort to represent both the ancient Greek and the English versions of the Nekyia, I built two lists of collected characters. With the resulting algorithm it is possible to use the large quantity of values which result from encoding the text serially. Depending on the text being encoded, the quality of the output ranges from rhythmic and melodic to chaotic. Theoretically, if we played the rhythmic patterns of a paragraph of the original text, we could retrieve the rhythmical structure of the ancient Greek version of The Odyssey, which in this case is the dactylic hexameter. To represent the rhythmic scheme of The Odyssey in ancient Greek I used its scansion system.

Scansion system example:

--|-uu|-uu|-uu|--|--

The Dactyl ( -uu ) is a metrical pattern known as a "foot" which comprises one long syllable followed by two short syllables.
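Read this way, a scansion string translates directly into a rhythmic pattern. A small Python sketch (the duration values are arbitrary placeholders):

```python
# Map a scansion string to note durations: "-" is a long syllable,
# "u" a short one, and "|" only separates feet.
def durations(scansion, long_dur=0.5, short_dur=0.25):
    return [long_dur if c == "-" else short_dur
            for c in scansion if c in "-u"]

hexameter = "--|-uu|-uu|-uu|--|--"
print(durations(hexameter))
```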

See more about Dactylic Hexameter

Interactive environments

Click the link on the right panel to watch a screencast of the gameplay.

Aranea game play

In my portfolio work of creative responses to text, I thought it would be interesting to experiment with story-telling in videogames and to create a "video game"; so I used a well-known myth from the Greek tradition. In particular, I chose, as a central story, a part of the myth of the Goddess Athena, in which we find the story of Arachne (which is the Greek word for 'spider'), as described by Ovid in his Metamorphoses (Riley, 1893). According to Book VI of Ovid's Metamorphoses, the story is as follows: ...the maiden Arachne who lived in Colophon, an opulent city of Lydia...Arachne, vain-glorious of her ingenuity, challenges Minerva (Athena) to take part in a contest of skill in her art. The Goddess accepts the challenge, but being enraged to see herself outdone, strikes her rival with her shuttle; upon which, Arachne, in her distress, hangs herself. Minerva, touched with compassion, transforms her into a spider. (Riley, 1893, Fable 1)

Aesthetically, I have been influenced by the video game Limbo (2010) by Playdead.

Short video series

Spooky Walk

Delphi trip

Click the link on the right panel to watch short video series.

Algorithmic Composition

An algorithm is a finite, well-defined sequence of instructions for solving a problem or carrying out a computation.

Primary principles of automated information processing can already be found in the 13th century, and later in the works of Charles Babbage and Ada Lovelace. The history of algorithmic composition itself begins shortly after the turn of the first millennium, with a system developed by Guido of Arezzo for generating melodic material from texts; it spans the application of algorithmic principles in the development of complex polyphony, and reappears in the "composition machines" of Athanasius Kircher in the Baroque period. Early applications of algorithms to compositional tasks can also be found in the popular "musical dice games" of the 18th century.

Paradigms

Markov models are for the most part employed in the field of style imitation, but they have also been used, for example by Hiller and Xenakis, for applications of genuine composition.
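The idea behind Markov-based style imitation can be sketched briefly in Python; the note sequence below is a toy corpus, not taken from any of the works mentioned:

```python
import random

# First-order Markov model: learn note-to-note transitions from a short
# sequence, then generate a new sequence in the same "style".
def train(notes):
    table = {}
    for a, b in zip(notes, notes[1:]):
        table.setdefault(a, []).append(b)
    return table

def generate(table, start, length, seed=0):
    random.seed(seed)
    out = [start]
    while len(out) < length and table.get(out[-1]):
        out.append(random.choice(table[out[-1]]))
    return out

corpus = ["C", "E", "G", "E", "C", "G", "C", "E"]
model = train(corpus)
print(generate(model, "C", 8))
```

Higher-order models condition each choice on more than one previous note, which tightens the stylistic resemblance to the corpus.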

Chomsky's generative grammars have been applied to the generation of musical structure.

David Cope’s program “EMI” generates style imitations after having analyzed a sufficient number of compositions of a particular genre.

http://www.computerhistory.org/atchm/algorithmic-music-david-cope-and-emi/

Fractals, Lindenmayer systems

Nierhaus 2009 (Algorithmic Composition),

Roads 1996 (Computer Music Tutorial)

For more info see here:

Algorithmic Composition

Code and tips

This section communicates the experience of building tools for interactive and data-driven audiovisuality using creative coding environments and techniques.

It contains some basic information and tips (installation and usage) about Emacs, IPython Notebook, SuperCollider, Raspberry Pi, the shell and g++.

Creative coding examples are enclosed in code blocks on the right panel of this page.

Programming languages:

There is also a discussion of devices and protocols used for interaction and communication between user and machine, as well as between machine and machine.

Some examples are:
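One such protocol is Open Sound Control (OSC), used elsewhere on this page with TouchOSC. A minimal sketch of an OSC message encoder in Python, for illustration only (real projects would normally use a library such as python-osc):

```python
import struct

# OSC messages are binary: strings are null-terminated and padded to
# 4-byte boundaries; a type-tag string like ",f" describes the arguments.
def osc_string(s):
    b = s.encode() + b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_message(address, *floats):
    tags = "," + "f" * len(floats)            # all arguments as 32-bit floats
    payload = b"".join(struct.pack(">f", f) for f in floats)
    return osc_string(address) + osc_string(tags) + payload

msg = osc_message("/synth/freq", 440.0)
# the bytes in msg could now be sent over UDP, e.g. to SuperCollider's
# default language port 57120, with socket.sendto()
print(len(msg))
```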

Usage

Organising and sharing

Organising and sharing projects using tools such as Git and mass-communication platforms.

Shell

Installing tmux

tmux is a "terminal multiplexer": it enables a number of terminals (or windows) to be accessed and controlled from a single terminal. tmux is intended to be a simple, modern, BSD-licensed alternative to programs such as GNU screen.

This release runs on OpenBSD, FreeBSD, NetBSD, Linux, OS X and Solaris.

tmux depends on libevent 2.x. Download it from:

http://libevent.org

use this command to install it:

Choose Shell

$ sudo apt-get install libevent-dev

It also depends on ncurses, available from:

http://invisible-island.net/ncurses/

To install ncurses, open a shell and type:

$ sudo apt-get install libncurses5-dev libncursesw5-dev

To build and install tmux from a release tarball, use:

$ ./configure && make
$ sudo make install

tmux can use the utempter library to update utmp(5), if it is installed - run configure with --enable-utempter to enable this.

To get and build the latest tmux from version control:

	$ git clone https://github.com/tmux/tmux.git
	$ cd tmux
	$ sh autogen.sh
	$ ./configure && make

To run sh autogen.sh you need to install automake (aclocal is part of the automake package).

$ sudo apt-get update
$ sudo apt-get upgrade
$ sudo apt-get install automake

To install tmux run the following command:

$ sudo apt-get update
$ sudo apt-get upgrade
$ sudo apt-get install tmux

To run tmux, open a shell and type:

tmux

see the tmux manual here tmux

and here tmux-cheatsheet

Switch to zsh

The Z shell (zsh) is a Unix shell [...]. Zsh can be thought of as an extended Bourne shell with a large number of improvements, including some features of bash, ksh, and tcsh.

Installing zsh on Linux

Some Linux systems come with zsh preinstalled. You can check whether it exists, and its version, by typing zsh --version in a terminal window. If that version is fine for you, you're done!

Determine which Linux distribution your system is based on. See List of Linux distributions – Wikipedia for a list. Most Linux systems – including Ubuntu – are Debian-based.

Debian-based linux systems

Open a terminal window. Copy & paste the following into the terminal window and hit Return. You may be prompted to enter your password.

Choose Shell

$ sudo apt-get update
$ sudo apt-get upgrade
$ sudo apt-get install zsh

You can use zsh now.

To switch from bash to zsh, open a shell and run the following command:

$ chsh -s $(which zsh)

To find out which shell you are using, use the following command:

$ echo $SHELL

It will return something like:

/bin/bash

or

/bin/zsh

After you know the shell, if the file .bashrc or .zshrc doesn't exist in your home directory (run echo $HOME to find it), just create it.

If you are using bash, you may have a file called .bash_profile where you can put your export commands instead (zsh reads .zprofile for the same purpose in login shells).
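For example, an export line of the kind meant here looks like the following (the variable and its value are just placeholders):

```shell
# an export line as it would appear in ~/.bash_profile or ~/.zshrc
export EDITOR="emacs"
echo "$EDITOR"
```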

Convert m4a, mp4, etc. to wav

Choose Shell

install ffmpeg

$ brew install ffmpeg --with-fdk-aac --with-ffplay --with-freetype --with-libass --with-libquvi --with-libvorbis --with-libvpx --with-opus --with-x265

link brew with python

$ brew link python

remove existing conflicting files, e.g.:

$ rm '/usr/local/Cellar/2to3'

convert .m4a to .wav

$ ffmpeg -i /Users/sounds/23.3.\ j.m4a  23.3.\ j.wav

convert all the .wma files in a directory to .mp3

$ for file in *.wma; do ffmpeg -i "${file}"  -acodec libmp3lame -ab 192k "${file/.wma/.mp3}"; done

Emacs

GNU Emacs: an extensible, customizable, free/libre text editor, and more. At its core is an interpreter for Emacs Lisp, a dialect of the Lisp programming language with extensions to support text editing. (from the GNU Emacs website)

Install packages

Install org-plus-contrib

;; add this to your Emacs init file to be able to list the Org mode archives:

(require 'package)

(add-to-list 'package-archives '("org" . "http://orgmode.org/elpa/") t)

Then M-x list-packages RET will list both the latest org and
org-plus-contrib packages.

Org-mode

Create Headers

Start with an asterisk to make Headers and two asterisks for Subheaders

Example:

* Header

** Subheader

Give a title to your page using hash (#) and plus (+) symbols

Example: #+Title: Getting started with org-mode

Hide Numbers, table of contents

Example: #+Options: num:nil toc:nil

Write #+ and press Meta-<tab> to see the list of variables

Example:

#+AUTHOR: Vasilis Agiomyrgianakis

#+DATE: 120416

Bulleting-Quoting

Use hyphen to make bullets

You can include quotations in Org mode documents like this:

#+BEGIN_QUOTE

'QUOTATION'

#+END_QUOTE

QUOTATION

Markups

Give emphasis to your text.

Write your text inside the below symbols:

Bold, italics, verbatim, strikethrough

Linking

Press C-c C-l to link objects (files)

Example:

Link: https//:basmyr.net

Then give a name to the linked text

Description: Basmyr.net

Press C-c C-o to open the linked plain text with an external program

Basmyr.net

or a video url

Granulator

Tables

Use pipes - vertical bars to make tables

Example: Start with pipes and some text:

| Some | Data |

then hit return, then pipe (vertical bar), hyphen and TAB to extend the table vertically

Press tab and the arrows keys to make arrangements on the table

| Some | Data  |
|------+-------|
|  234 | muons |
| 1200 | jets  |

Images & Graphics

Images

To insert an image with descriptions do the following:

Example-images

#+Caption: This is my image
#+Name: Fig 1
[[./images/myimage]]

Ditaa

#+BEGIN_SRC ditaa :file image/awesome.png

+---------+
| awesome |
+---------+

#+END_SRC

C-c C-c to evaluate lisp code inside source block

Find the path of ditaa.jar on your computer with a small Lisp program:

(expand-file-name
             "ditaa.jar"
      (file-name-as-directory
            (expand-file-name
                "scripts"
               (file-name-as-directory
                  (expand-file-name
                      "../contrib"
                     (file-name-directory (org-find-library-dir "org")))))))

Export to other formats

Pressing C-c C-e pops up a buffer for exporting the markup to HTML, PDF, and other formats.

Example: hit h and o if you want to export and open as html.

Export Beamer: C-c C-e l P (org-beamer-export-to-pdf)

Export PDF: C-c C-e l O

To export and open a PDF, make sure you have installed MacTeX with extras, not the basic version.

Export as LaTeX, and open PDF file.

Source Code

Create code blocks to insert your code.

Press C-c ' inside the SRC block to edit the current code block

in the mode of the language you want. For instance:

#+BEGIN_SRC emacs-lisp

write some lisp to configure org-mode,

for example so as to see bullets (UTF-8 characters) instead of asterisks when editing Headers in org-mode.

Then close the source block with:

#+END_SRC

Result

     (require 'org-bullets)
(add-hook 'org-mode-hook
          (lambda () (org-bullets-mode 1)))

You can customise source blocks using M-x customize-face RET face RET

Evaluate source code. Press C-c C-c inside the block and see the results.

echo "Hello $USER! Today is `date`"

LaTeX integration

\begin{align*} q &= 2 * 4 + 1 - 2 = 7 \\ q &= 7 \end{align*}

Shortcuts Tips

Write <s and press tab to open src blocks,

<q tab for Quotes,

<e tab for Examples

<c tab for Center

etc.

To comment a Lisp region, select a word or a region with C-M-space and then press M-; to comment it.

Change read-only files on emacs

M-x toggle-read-only

TO DO

Type TODO to create a todo item. Move the cursor one line after the TODO item and press C-c C-s =(org-schedule)= to schedule it with the agenda.

TODO Call John

SCHEDULED: <2016-11-09 Wed>

TODO read this and that

SCHEDULED: <2016-11-10 Thu>

DONE export html minted (highlight colour - syntax source blocks)

SCHEDULED: <2016-10-12 Wed>

To open the week-agenda window press C-c a. To schedule a TODO item press C-c C-s. Use shift-arrows to change dates.

Github

Introduction to environments (GitHub or Bitbucket) for organizing and sharing files with Git. Set up accounts with Slack and GitHub.

Github

Organize projects and share the individual processes using tools such as Slack and Git.

Git:

GitHub Hello world

Github example

https://guides.github.com/activities/hello-world/, https://gist.github.com/davfre/8313299

Install the magit package and learn its usage

Magit

Download Magit

Install Magit using MELPA

Dired to the folder where you want to create the .git directory and press

M-x magit-init and press y

Then press M-x magit-remote-add

to add a remote repository as origin for the master branch

Back up the repository online on GitHub

First, create an ssh key to gain access into your repositories in Github

Copy the public key to the clipboard from the terminal, e.g.:

$ cat id_rsa.pub | pbcopy

Press C-x m to display information about the current Git repository

Press C-x g for magit-status

s to stage files

Press c c, write a commit message, and then press C-c C-c to commit

The next step is to push to a remote branch on Github.

Press P p (shift-p, then p) to push to a remote branch (master).

Pull requests

If you use more than one computer for the same repo, you need to pull first and then push to the remote repo.

On magit press F and p to pull to master. Then you can push to this exact repository from your other computer. If you later want to update the changes on your first computer, you need to pull again and then push.

NOTE

This may seem a little tricky, but it can happen.

You might need to change the URL of your repo on your first computer if you didn't choose the same folder name for your project on your second computer. This happens if, for example, you first created the repo on your laptop in a folder named project1, built your site (e.g. with hugo), and then pulled this repo into a folder named project2 on your other computer.

To change the repo address, go to the .git folder inside your project and open the config file. Then change the repo address to the new one.

i.e in config replace the old url with the new one:

[remote "origin"]

url = git@github.com:User/project1.git

url = git@github.com:User/project2.git

After you did this you will be able to push again from your first computer.

other issues

git cherry-pick -m 1 1234deadbeef1234deadbeef
git rebase --continue

merge

Git failed: refusing to merge unrelated histories

In magit, press h to bring up the popup window and choose rebase -r and then -p
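Outside magit, the same error can be resolved on the command line with --allow-unrelated-histories. A self-contained sketch with two scratch repositories that share no common ancestor (all paths and the commit identity are placeholders):

```shell
# Build two independent repositories, then merge one into the other.
c="git -c user.name=me -c user.email=me@example.com"
a=$(mktemp -d); cd "$a"; git init -q
echo one > a.txt; git add a.txt; $c commit -q -m "repo A"

b=$(mktemp -d); cd "$b"; git init -q
echo two > b.txt; git add b.txt; $c commit -q -m "repo B"

# A plain merge would fail with "refusing to merge unrelated histories"
git fetch -q "$a" HEAD
$c merge -q --no-edit --allow-unrelated-histories FETCH_HEAD
git log --oneline    # shows the commits of both repositories
```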

Install Dictionaries -emacs

Install aspell with brew

$ brew install aspell --with-lang-en --with-lang-el --with-lang-de

for the English, Greek (Hellenic) and German (Deutsch) dictionaries.

If you have problems installing aspell with the above command, just run:

$ brew install aspell

Install aspell using downloaded packages aspell-0.60.6.1

go to terminal and type:

$ cd ~/aspell path
$ ./configure
$ make
$ make install

To install an additional dictionary, download the language you prefer from GNU_Available Aspell Dictionaries

e.g. aspell-el-0.50-3 for the Hellenic language (Greek)

and go to terminal and type:

$ cd ~/dictionary path
$ ./configure
$ make
$ make install

Switch dictionary

To switch between dictionaries run:

M-x ispell-change-dictionary

and write greek for Hellenic (Greek) auto-correction

Press F6 (fn-F6) to switch between dictionaries (British, Greek, German)

If you want to use the English dictionary in a particular buffer instead, put the following on the first line of the buffer:

-*- ispell-dictionary: "english" -*-

Use flyspell instead of ispell

(setq ispell-list-command "--list")

text expansion

install YASnippet using MELPA

Put the code below into your init.el

(add-to-list 'load-path
             "~/.emacs.d/plugins/yasnippet")
(require 'yasnippet)
(yas-global-mode 1)

Quit emacs, open it again, and type

M-x yas-new-snippet

Try using the abbrev key and press TAB to expand your text.

Searching

Press C-s to search with I-search.

You can also try:

C-h f (or M-x describe-function) will show you the bindings for a command.
C-h b (or M-x describe-bindings) will show you all bindings.
C-h m (M-x describe-mode) is also handy to list bindings by mode.
You might also try C-h k (M-x describe-key) to show what command is bound to a key.

See also helm swoop

There is also Projectile. To use Projectile, make sure you have created a .git repo in your project.

Press C-c p p to open Projectile and search for projects

Export references to pdf with org-mode - bibtex

Use bibtex package for citation.

First put the code below in your ~/.emacs

;; Bibtex-latex export citation
(setq org-latex-pdf-process
      '("latexmk -pdflatex='pdflatex -interaction nonstopmode' -pdf -bibtex -f %f"))

Your next step is to create a .bib file with your citations and name it, e.g., test-bib-refs.bib

Example of bibtex style

@ARTICLE{2011ApJS..192....9T,
   author = {{Turk}, M.~J. and {Smith}, B.~D. and {Oishi}, J.~S. and {Skory}, S. and
     {Skillman}, S.~W. and {Abel}, T. and {Norman}, M.~L.},
    title = "{yt: A Multi-code Analysis Toolkit for Astrophysical Simulation Data}",
  journal = {The Astrophysical Journal Supplement Series},
archivePrefix = "arXiv",
   eprint = {1011.3514},
 primaryClass = "astro-ph.IM",
 keywords = {cosmology: theory, methods: data analysis, methods: numerical },
     year = 2011,
    month = jan,
   volume = 192,
      eid = {9},
    pages = {9},
      doi = {10.1088/0067-0049/192/1/9},
   adsurl = {http://adsabs.harvard.edu/abs/2011ApJS..192....9T},
  adsnote = {Provided by the SAO/NASA Astrophysics Data System}
}

another example:

@InProceedings{ alejandro_weinstein-proc-scipy-2016,
  author    = { {A}lejandro {W}einstein and {W}ael {E}l-{D}eredy and {S}téren {C}habert and {M}yriam {F}uentes },
  title     = { {F}itting {H}uman {D}ecision {M}aking {M}odels using {P}ython },
  booktitle = { {P}roceedings of the 15th {P}ython in {S}cience {C}onference },
  pages     = { 1 - 6 },
  year      = { 2016 },
  editor    = { {S}ebastian {B}enthall and {S}cott {R}ostrup }
}

then use these latex commands inside your org file

#+LaTeX_HEADER: \usepackage[natbib]{biblatex}

#+LATEX_HEADER: \bibliographystyle{plain}

#+LATEX_HEADER: \bibliography{test-bib-refs}

Tests in org file

Latex

\cite{2011ApJS..192....9T}.

This is test2

\cite{alejandro_weinstein-proc-scipy-2016}.

\printbibliography

Note: put \printbibliography at the end so the references section is printed last in your text.

Emacs-G++

G++ Hello_World

Create a folder hello_world, cd into this folder, and create a file hello_world.cpp

Copy the code below and paste it into the hello_world.cpp file

/*hello_world.cpp*/

#include <iostream>
using namespace std;

int main() {
    cout << "hello world!" << endl;
    return 0;
}

The next step is to create a makefile.

Use the code below (the second line must start with a TAB):

hello_world: hello_world.cpp
[TAB]g++ -Wall -g hello_world.cpp -o hello_world

Then go back (C-x b) to hello_world.cpp and press C-x c to run the program.

You will see a 'hello world!' message.

Create and run a program with multiple source files

On emacs, create a folder [dired to the directory where you want the folder, press Shift-+ and give a name], then dired into the folder and create five source files (main.cpp, Point.h, Point.cpp, Line.h, Line.cpp):

Compiling source code

Compile each source file separately using the compiler's -c flag, e.g.

% g++ -c main.cpp
% g++ -c Point.cpp
% g++ -c Line.cpp

This process generates the object files (.o)

The next step is to link the object files into an executable

% g++ -o main main.o Point.o Line.o

makefile

Create a makefile

Open emacs, dired to the folder with the .cpp and .o files, and create a makefile. This file generates the executable main

The first part sets the variables used in a simple makefile. These specify the C++ compiler and linker, as well as flags for the compiler, etc.

Compiler

Comp = g++

Compiler Flags

CompFL = -Wall -g

The other part sets the targets, which can be files to be generated.

# Targets needed to bring the executable up to date

main: main.o Point.o Line.o

[TAB]$(Comp) $(CompFL) -o main main.o Point.o Line.o

# If we hadn't set the variables Comp and CompFL, the line above would be:
# g++ -Wall -g -o main main.o Point.o Line.o

# The main.o target can be written more simply

main.o: main.cpp Point.h Line.h

[TAB]$(Comp) $(CompFL) -c main.cpp

Point.o: Point.h

Line.o: Line.h Point.h

To comment a makefile, use #

After creating the makefile, choose the main.cpp file, press C-x c, and then hit enter to begin the compilation.

Clone the repository below to create a tetrahedron and also to see an example of how to set up a makefile for the OpenGL and GLUT frameworks.

SSH: git@github.com:Vasileios/Gpp.git

cmake

Click the links below to download and install SFML:

SFML download

SFML Mac OS X

You can also read the instructions of building SFML with cmake:

Build SFML project

Clone this repository and run the famous cellular automaton, Conway's Game of Life.

On terminal, cd to the folder and use this command:

cmake -G"Xcode" -DSFML=/usr/local/include

Running 3d graphics g++

clone this repo

https://github.com/scanberg/particle-skinning

Requirements - glm, assimp, qt5

brew install qt5
brew install glm
brew install assimp

Open main.cpp in emacs and hit C-x c to make the program.

iPython

Choose Python to see the code.

Introduction to Python (https://www.python.org/doc/).

Python

# Python 3: Fibonacci series up to n
def fib(n):
    a, b = 0, 1
    while a < n:
        print(a, end=' ')
        a, b = b, a+b
    print()
fib(1000)

IPython-notebook

For more info see: https://ipython.org

Two other key components are Jupyter Notebooks and Anaconda. Jupyter provides Mathematica-like notebooks, and Anaconda is a package management system.

Jupyter Notebooks, originally called IPython Notebooks, are commonly used for improving the reproducibility and accessibility of scientific research.

Other math/science/data oriented Python tools

Install ipython on emacs:

First install Anaconda: https://www.continuum.io/downloads. Check your Python version in the terminal with python --version (e.g. 3.5) and download Anaconda3.

After downloading Anaconda, open a terminal, cd to the directory containing the installer, and type:

Choose Shell to see the code

bash Anaconda3-4.3.0-MacOSX-x86_64.sh

Press yes to let the Anaconda3 installer add the PATH to your .bash_profile

The next step is to:

copy ein.el and ein.py to a directory in your Emacs load-path

Choose emacs-lisp to see the code

(require 'ein)

Start the IPython notebook server: go to the terminal and write jupyter notebook, then copy the token and paste it as the password to log in to the server.

On emacs, hit M-x ein:notebooklist-login and press return to use the localhost:8888 server, then use the token (password) to log in.

e.g. password: 8b6cae64f7dbcfc425a2dsf30cretfdfc7d730dcba9180ab8

Term output example

Choose Shell to see the output

[I 01:49:54.596 NotebookApp] Serving
notebooks from local directory: /Users/usr_name
[I 01:49:54.596 NotebookApp] 0 active kernels
[I 01:49:54.597 NotebookApp] The Jupyter Notebook is running at:
http://localhost:8888/?token=8b6cae64f7dbcfc425a2dsf30cretfdfc7d730dcba9180ab8
[I 01:49:54.597 NotebookApp] Use Control-C to stop this server and shut
down all kernels (twice to skip confirmation).
[C 01:49:54.626 NotebookApp]

    Copy/paste this URL into your browser when you connect for the first time,
    to login with a token:

  http://localhost:8888/?token=8b6cae64f7dbcfc425a2dsf30cretfdfc7d730dcba9180ab8

If you successfully logged in to the server:

Hit M-x ein:notebooklist-open to open the notebook list. This will open a notebook list buffer.

In the notebook list buffer, you can open notebooks by hitting [Open], open directories with [Dir], create a new notebook with [New notebook], and delete a notebook with [Delete].

Note: you can also check the ob-python package for ipython source code blocks in org-mode

You can start testing ipython using these CERN examples: particle-physics-playground-playground-52de62d

Sonifying CMS muons - ipython - SuperCollider in emacs

Editor: Emacs Version 24.5 (9.0)

Ipython package ein on MELPA

SuperCollider 3.7

Data sonification experiment on particle-physics-playground.

For more info see here:

particle-physics-playground

To send OSC messages to other applications, install the python-osc library.

In this case I use SuperCollider port 57120.

Choose SuperCollider

// BA 28022017
// Testing osc communication - Receiving data from ipython - 'CMS' (Compact Muon Solenoid) __

s.boot // boot the server
s.record // record
s.stopRecording // stop recording


// create synthdef
(
SynthDef(\ipythontest, {
	|freq = 440, gate = 1, amp = 0.5, out = 0|
	var env, source;

	// use the gate argument so the synth can be released
	env = EnvGen.kr(Env.adsr, gate, doneAction: 2);

	//source = SinOsc.ar(freq*2, 0, amp);
	source = SinOsc.ar(SinOsc.ar(freq*2, freq*4, freq*2), 0, amp);
	// source = UseWhateverGen.ar();

	// Pan2 already returns a stereo pair, so no duplication is needed
	Out.ar(out, Pan2.ar(source*env, 0));
}).add;

~x=Synth(\ipythontest, [\freq, 440, \amp, 0.5]); // run the synth


// set osc

~a = OSCdef(\oscTest,
	{
		| ... msg | msg.postln;

		// msg[0] is the received message: ['/print', value]
		~x.set(\freq, msg[0][1], \amp, 0.9);
		//~muons = msg[0][1..];
		//~muons.postln;

		// use the osc messages (msg) for the frequency
	},
	'/print' // OSC message name
);
)

ipython notebook

Choose Python

#VA_exp_280217_001


#Import libraries numpy, matplotlib, pythonosc

In [1]
import numpy as np
import matplotlib.pylab as plt

from IPython import get_ipython
get_ipython().run_line_magic('matplotlib', 'inline')

In [2]
#from __future__ import print_function
#from __future__ import division
import sys

sys.path.append("../particle-physics-playground-Sonification-Example_001/tools/")

#from draw_objects3D import *
import cms_tools as cms
In [3]
infile = open('../particle-physics-playground-Sonification-Example_001/data/small_cms_test_file.dat')

collisions = cms.get_collisions(infile)

number_of_collisions = len(collisions)
print ("# of proton-proton collisions: %d" % (number_of_collisions))


# of proton-proton collisions: 10


In [4]
print (collisions[0])


[[[88.9127, 32.9767, -75.1939, 29.541, -1.0], [79.2211, -58.6558, 49.1723, 13.5915, -1.0], [43.313, -5.9129, 40.0892, 12.0431, -1.0], [274.8094, -21.4194, 27.5639, -272.4152, -1.0], [26.6201, 0.5268, -24.7563, -7.4046, 0.0]], [[15.7375, 1.4793, -15.2566, -3.5645, -1]], [], [[52.364, 17.4983, -45.4233, 19.3009], [10.2904, -1.4633, 10.0887, 1.4035]], [44.9415, 0.422]]


In [5]

print (len(collisions[0]))

5


In [6]
METx = collisions[0][4][0]
METy = collisions[0][4][1]

print ("MET x: %f" % (METx))
print ("MET y: %f" % (METy))

MET x: 44.941500
MET y: 0.422000


In [7]
print ("# of jets:      %d" % (len(collisions[0][0])))
print ("# of muons:     %d" % (len(collisions[0][1])))
print ("# of electrons: %d" % (len(collisions[0][2])))
print ("# of photons:   %d" % (len(collisions[0][3])))

# of jets:      5
# of muons:     1
# of electrons: 0
# of photons:   2


In [8]
jets,muons,electrons,photons,met = collisions[0]


In [9]
E,px,py,pz,btag = jets[0]
print ("E:     %8.4f" % (E))
print ("px:    %8.4f" % (px))
print ("py:    %8.4f" % (py))
print ("pz:    %8.4f" % (pz))
print ("btag:  %8.4f" % (btag))

E:      88.9127
px:     32.9767
py:    -75.1939
pz:     29.5410
btag:   -1.0000


In [10]
E,px,py,pz,q = muons[0]
print ("E:  %8.4f" % (E))
print ("px: %8.4f" % (px))
print ("py: %8.4f" % (py))
print ("pz: %8.4f" % (pz))
print ("q:  %8.4f" % (q))

E:   15.7375
px:   1.4793
py: -15.2566
pz:  -3.5645
q:   -1.0000


In [11]
E,px,py,pz = photons[0]
print ("E:  %8.4f" % (E))
print ("px: %8.4f" % (px))
print ("py: %8.4f" % (py))
print ("pz: %8.4f" % (pz))

E:   52.3640
px:  17.4983
py: -45.4233
pz:  19.3009


In [0]
# Plot the quantities
plt.figure(figsize=(16,4))

plt.subplot(1,3,1)
plt.hist(njets,bins=5,range=(0,5))
plt.xlabel(r'# of jets')
plt.ylabel('# entries')

plt.subplot(1,3,2)
plt.hist(jets_E,bins=25,range=(0,400))
plt.xlabel(r'Jet energy [GeV]')
plt.ylabel('# entries')

plt.subplot(1,3,3)
plt.hist(muons_E,bins=25,range=(0,400))
plt.xlabel(r'Muon energy [GeV]')
plt.ylabel('# entries')

Watch an example

In [0]
from IPython.display import YouTubeVideo
YouTubeVideo('UfimSbOr9to')

In [13]
infile = open('../particle-physics-playground-Sonification-Example_001/data/mc_dy_1000collisions.dat')


collisions = cms.get_collisions(infile)

# We will use these to store the quantities that we will be plotting later.
njets = []
jets_E = []
muons_E = []
photons_E = []

for collision in collisions:

    jets,muons,electrons,photons,met = collision

    njets.append(len(jets))

    for jet in jets:
        E,px,py,pz,btag = jet
        jets_E.append(px)

    for muon in muons:
        E,px,py,pz,q = muon
        muons_E.append(E)

    for photon in photons:
        E,px,py,pz = photon
        photons_E.append(E)
In [18]
import time

infile = open('../particle-physics-playground-Sonification-Example_001/data/mc_dy_1000collisions.dat')


collisions = cms.get_collisions(infile)

# We will use these to store the quantities that we will be plotting later.
njets = []
jets_E = []
muons_E = []
photons_E = []

for collision in collisions:

    jets,muons,electrons,photons,met = collision

    njets.append(len(jets))

    for jet in jets:
        E,px,py,pz,btag = jet
        jets_E.append(E )

    for muon in muons:
        E,px,py,pz,q = muon
        muons_E.append(E)

    for photon in photons:
        E,px,py,pz = photon
        photons_E.append(E)


# Set up OSC here

from pythonosc import osc_message_builder
from pythonosc import udp_client

# The port for SuperCollider is '57120'

client = udp_client.SimpleUDPClient("127.0.0.1", 57120)


#client.send_message("/print", muons_E)

# now we can print them out too

for i in muons_E:
      print ("muon was: %d" % i)
      client.send_message("/print", i)
      time.sleep(0.015)

for i in jets_E:
      print ("jet was: %d" % i)
      client.send_message("/print", i)
      time.sleep(0.015)

for i in photons_E:
      print ("photon was: %d" % i)
      client.send_message("/print", i)
      time.sleep(0.015)
# Plot the quantities

yt visualisation

An example with Enzo data

Choose Python

import os

os.chdir('/Users/experiments/yt_pics')

import yt

ds = yt.load("/Users/experiments/Enzo_64/DD0043/data0043")

sc = yt.create_scene(ds, lens_type='perspective')

# Get a reference to the VolumeSource associated with this scene
# It is the first source associated with the scene, so we can refer to it
# using index 0.
source = sc[0]

# Set the bounds of the transfer function
source.tfh.set_bounds((3e-31, 5e-27))

# set that the transfer function should be evaluated in log space
source.tfh.set_log(True)

# Make underdense regions appear opaque
source.tfh.grey_opacity = True

# Plot the transfer function, along with the CDF of the density field to
# see how the transfer function corresponds to structure in the CDF
source.tfh.plot('transfer_function.png', profile_field='density')

# save the image, flooring especially bright pixels for better contrast
sc.save('rendering2.png', sigma_clip=6.0)

For 3D modeling with yt see here:

Data Visualisation 3D

Ipython - realtime data

Watching the number of flights from your emacs:

This experiment was tested on Python 3.5 and the emacs ipython notebook (ein).

For ipython notebook installation see this webpage ipython.

To run this example you need to install some external modules:

requests and BeautifulSoup

If you use pip (recommended) open the terminal and type

Choose Shell

$ pip install requests
$ pip install beautifulsoup4

Go to the web page to scrape the number of flights

https://www.flightradar24.com/56.16,-49.51/7

The number is updated every 8 seconds.

To be able to collect the number of flights in real time, find the .js file in the webpage. To find the js file go to Chrome > More Tools > Developer Tools > Network; there you'll find the request under the name feed.js.

Now, run the code below in your ipython notebook. (code taken from here)

Choose Python

import requests
from bs4 import BeautifulSoup
import time

def get_count():
    url = "https://data-live.flightradar24.com/zones/fcgi/feed.js?bounds=59.09,52.64,-58.77,-47.71&faa=1&mlat=1&flarm=1&adsb=1&gnd=1&air=1&vehicles=1&estimated=1&maxage=7200&gliders=1&stats=1"

    # Request with a fake header, otherwise you will get a 403 HTTP error
    r = requests.get(url, headers={'User-Agent': 'Mozilla/5.0'})

    # Parse the JSON
    data = r.json()
    counter = 0

    # Iterate over the elements to get the number of total flights
    for element in data["stats"]["total"]:
        counter += data["stats"]["total"][element]

    return counter

while True:
    print(get_count())
    time.sleep(8)

Watch here a screen capture

openFrameworks

MSAFluids-Kinect

MSAFluids-data visualisation

Choose C++

ofApp.h

#pragma once


#include "MSAFluid.h"
//#include "MSATimer.h"
#include "ParticleSystem.h"
#include "ofMain.h"
#include "ofxOpenCv.h"
#include "ofxXmlSettings.h"
#include "ofxUI.h"
#include "ofxGui.h"
#include "ofxOsc.h"
#define HOST "localhost"
#define PORT 12345

#define NUM_MSG_STRINGS 20


// comment this line out if you don't wanna use TUIO
// you will need ofxTUIO & ofxOsc
#define USE_TUIO

// comment this line out if you don't wanna use the GUI
// you will need ofxSimpleGuiToo, ofxMSAInteractiveObject & ofxXmlSettings
// if you don't use the GUI, you won't be able to see the fluid parameters
#define USE_GUI


#ifdef USE_TUIO
#include "ofxTuio.h"
#define tuioCursorSpeedMult				0.5	// the iphone screen is so small, easy to rack up huge velocities! need to scale down
#define tuioStationaryForce				0.001f	// force exerted when cursor is stationary
#endif


#ifdef USE_GUI
#include "ofxSimpleGuiToo.h"
#endif

// uncomment this to read from two kinects simultaneously
//#define USE_TWO_KINECTS

class ofApp : public ofBaseApp {
public:

    void setup();
    void setupGui();
    void update();
    void drawGui(ofEventArgs & args);
    void draw();
    void exit();
    void keyPressed(int key);
    //void mouseMoved(int x, int y );

   // void mouseDragged(int x, int y, int button);
    void mousePressed(int x, int y, int button);
    void mouseReleased(int x, int y, int button);
    void windowResized(int w, int h);
    void fadeToColor(float r, float g, float b, float speed);
    void addToFluid(ofVec2f pos, ofVec2f vel, bool addColor, bool addForce);

    ofVboMesh mesh;
    ofEasyCam cam;
    //ofxAssimpModelLoader model;
    ofLight light;


    float                   colorMult;
    float                   velocityMult;
    int                     fluidCellsX;
    bool                    resizeFluid;
    bool                    drawFluid;
    bool                    drawParticles;

    msa::fluid::Solver      fluidSolver;
    msa::fluid::DrawerGl	fluidDrawer;

    ParticleSystem          particleSystem;

    ofVec2f                 pMouse;

    //Each frame take the number of blobs and create cursors at their centroids
    vector<ofVec2f> cursors ;
    float cursorXSensitivity ;
    float cursorYSensitivity ;
    bool bRestrictCursors ;
    float cursorBorderPadding ;

    bool  bFullscreen;
    bool  bShowControlPanel;
    bool bEpsCapture;

    bool bThreshWithOpenCV;
    bool bKinectOpen ;
    int nearThreshold;
    int farThreshold;

    float minBlobSize , maxBlobSize ;
    int maxCursors ;


    /////earh here
    ofImage texture;
    bool holdingbutton;
    int oldalpha;
    int newalpha;
    ofImage offimage;
    ofImage onimage;

    GLUquadricObj *quadric;
    /////

     int mouseX =0;

#ifdef USE_TUIO
    ofxTuioClient tuioClient;
#endif



    ofxCvColorImage colorImg;

    ofxOscReceiver          receiver;

    float oscX = 0.0;
    float oscY = 0.0;
    int fadeAmt = 0;

    //this holds all of our points
    vector<ofVec3f> points;
    //this keeps track of the center of all the points
    //ofxPanel gui;



    ofBlendMode blendMode;
    ofImage rainbow;
    ofTrueTypeFont 	vagRounded;
    string eventString;
    string timeString;

};

ofApp.cpp

#include "ofApp.h"


char sz[] = "[Rd9?-2XaUP0QY[hO%9QTYQ`-W`QZhcccYQY[`b";


float tuioXScaler = 1;
float tuioYScaler = 1;
//--------------------------------------------------------------
void ofApp::setup() {
    ofSetLogLevel(OF_LOG_VERBOSE);

    // turn on smooth lighting //
    ofSetSmoothLighting(true);

    //need this for alpha to come through
    ofEnableAlphaBlending();

 //OSC

    receiver.setup(PORT);

    cout << "listening for osc messages on port " << PORT << "\n";

    //ofSetFrameRate(60);

    //Fluids
    for(int i=0; i<strlen(sz); i++) sz[i] += 20;

    // setup fluid stuff
    fluidSolver.setup(100, 100);
    fluidSolver.enableRGB(true).setFadeSpeed(0.002).setDeltaT(0.5).setVisc(0.00015).setColorDiffusion(0);
    fluidDrawer.setup(&fluidSolver);

    fluidCellsX			= 150;

    drawFluid			= true;
    drawParticles		= true;
    ofSetFrameRate(60);
    ofBackground(0);
    ofSetVerticalSync(false);


#ifdef USE_TUIO
    tuioClient.start(3333);
#endif


    windowResized(ofGetWidth(), ofGetHeight());		// force this at start (cos I don't think it is called)
    pMouse = msa::getWindowCenter();
    resizeFluid			= true;

    //ofEnableAlphaBlending();
    ofSetBackgroundAuto(false);


    ///////------mesh here
    /*
    mesh.setMode(OF_PRIMITIVE_TRIANGLE_STRIP);
    mesh.enableColors();

    ofVec3f  v0(-100, -100, -100);
    ofVec3f v1(100, -100, -100);
    ofVec3f v2(100, 100, -100);

    mesh.addVertex(v0);
    mesh.addColor(ofFloatColor(0.0, 0.0, 0.0));

    mesh.addVertex(v1);
    mesh.addColor(ofFloatColor(1.0, 0.0, 0.0));

    mesh.addVertex(v2);
    mesh.addColor(ofFloatColor(1.0, 1.0, 0.0));

    */

    ///---sphere here
    ofDisableArbTex();
    //ofLoadImage(texture, "earthTex.jpg");
    texture.load("earthTex.jpg");

    //this makes sure that the back of the model doesn't show through the front
    ofEnableDepthTest();
    // sphere.setRadius( width );
    //prepare quadric for sphere
    quadric = gluNewQuadric();
    gluQuadricTexture(quadric, GL_TRUE);
    gluQuadricNormals(quadric, GLU_SMOOTH);
    /////////////////////////

}

//---------------------------------------


void ofApp::fadeToColor(float r, float g, float b, float speed) {
    glColor4f(r, g, b, speed);
    ofDrawRectangle(0, 0, ofGetWidth(), ofGetHeight());

}


// add force and dye to fluid, and create particles
void ofApp::addToFluid(ofVec2f pos, ofVec2f vel, bool addColor, bool addForce) {
    float speed = vel.x * vel.x  + vel.y * vel.y * msa::getWindowAspectRatio() * msa::getWindowAspectRatio();    // balance the x and y components of speed with the screen aspect ratio
    if(speed > 0) {
        pos.x = ofClamp(pos.x, 0.0f, 1.0f);
        pos.y = ofClamp(pos.y, 0.0f, 1.0f);

        int index = fluidSolver.getIndexForPos(pos);

        if(addColor) {
            //			Color drawColor(CM_HSV, (getElapsedFrames() % 360) / 360.0f, 1, 1);
            ofColor drawColor;
            drawColor.setHsb((ofGetFrameNum() % 255), 255, 255);

            fluidSolver.addColorAtIndex(index, drawColor * colorMult);

            if(drawParticles)
                particleSystem.addParticles(pos * ofVec2f(ofGetWindowSize()), 10);
        }

        if(addForce)
            fluidSolver.addForceAtIndex(index, vel * velocityMult);

    }
}





//--------------------------------------------------------------
void ofApp::setupGui(){




    float dim = 24.0;

    gui.addSlider("fluidCellsX", fluidCellsX, 20, 400);
    gui.addButton("resizeFluid", resizeFluid);
    gui.addSlider("colorMult", colorMult, 0, 100);
    gui.addSlider("velocityMult", velocityMult, 0, 100);


    gui.addSlider("fs.viscocity", fluidSolver.viscocity, 0.0, 0.01);
    gui.addSlider("fs.colorDiffusion", fluidSolver.colorDiffusion, 0.0, 0.0003);
    gui.addSlider("fs.fadeSpeed", fluidSolver.fadeSpeed, 0.0, 0.1);
    gui.addSlider("fs.solverIterations", fluidSolver.solverIterations, 1, 50);
    gui.addSlider("fs.deltaT", fluidSolver.deltaT, 0.1, 5);
    gui.addComboBox("fd.drawMode", (int&)fluidDrawer.drawMode, msa::fluid::getDrawModeTitles());
    gui.addToggle("fs.doRGB", fluidSolver.doRGB);
    gui.addToggle("fs.doVorticityConfinement", fluidSolver.doVorticityConfinement);
    gui.addToggle("fs.wrapX", fluidSolver.wrap_x);
    gui.addToggle("fs.wrapY", fluidSolver.wrap_y);

    gui.addToggle("drawFluid", drawFluid);
    gui.addToggle("drawParticles", drawParticles);

    gui.addSlider("tuioXScaler", tuioXScaler, 0, 2);
    gui.addSlider("tuioYScaler", tuioYScaler, 0, 2);


    //--
    gui.currentPage().setXMLName("ofxMSAFluidSettings.xml");
    gui.loadFromXML();
    gui.setDefaultKeys(true);
    gui.setAutoSave(true);
    gui.show();


    ofSetBackgroundColor(0);
}
//--------------------------------------------------------------
void ofApp::update() {
    //OSC receive from SuperCollider

    while (receiver.hasWaitingMessages()) {
        ofxOscMessage m;
        receiver.getNextMessage(m);

        cout << "got message from OSC\n";

        if (m.getAddress() == "/data"){

            cout << "message was data as expected\n";

            ofVec2f eventPos = ofVec2f(m.getArgAsFloat(0), m.getArgAsFloat(1));
            ofVec2f mouseNorm = ofVec2f(eventPos) / ofGetWindowSize();
            ofVec2f mouseVel = ofVec2f(eventPos - pMouse) / ofGetWindowSize();
            addToFluid(mouseNorm, mouseVel, true, true);
            pMouse = eventPos;


        } else if (m.getAddress() == "/vertex") {
            cout << "message was vertex as expected\n";

        }
    }

        //Reset the cursors
        cursors.clear() ;

        if(resizeFluid) 	{
            fluidSolver.setSize(fluidCellsX, fluidCellsX / msa::getWindowAspectRatio());
            fluidDrawer.setup(&fluidSolver);
            resizeFluid = false;
        }


        fluidSolver.update();

}

//--------------------------------------------------------------
void ofApp::draw() {

    ofEnableAlphaBlending() ;

    ofPushMatrix() ;

    ////////////////////////////////////////////////
    for(int i = 1; i < points.size(); i++){

        //find this point and the next point
        ofVec3f thisPoint = points[i-1];
        ofVec3f nextPoint = points[i];

        //get the direction from one to the next.
        //the ribbon should fan out from this direction
        ofVec3f direction = (nextPoint - thisPoint);

        //get the distance from one point to the next
        float distance = direction.length();

        //get the normalized direction. normalized vectors always have a length of one
        //and are really useful for representing directions as opposed to something with length
        ofVec3f unitDirection = direction.getNormalized() + 0.1f ;

        //find both directions to the left and to the right
        ofVec3f toTheLeft = unitDirection.getRotated(-90, ofVec3f(0,0,1));
        ofVec3f toTheRight = unitDirection.getRotated(90, ofVec3f(0,0,1));

        //use the map function to determine the distance.
        //the longer the distance, the narrower the line.
        //this makes it look a bit like brush strokes
        float thickness = ofMap(distance, 0, 60, 40, 10, true);

        //calculate the points to the left and to the right
        //by extending the current point in the direction of left/right by the length
        ofVec3f leftPoint = thisPoint+toTheLeft*thickness;
        ofVec3f rightPoint = thisPoint+toTheRight*thickness;

        //add these points to the triangle strip

        mesh.addVertex(ofVec3f(leftPoint.x, leftPoint.y, leftPoint.z));
        mesh.addVertex(ofVec3f(rightPoint.x, rightPoint.y, rightPoint.z));

        mesh.addColor ( ofColor::fromHsb( sin ( (float)i ) * 40.0f + 128.0f, 255.0f , 255.0f ) ) ;
        mesh.addColor ( ofColor::fromHsb( sin ( (float)i ) * 40.0f + 128.0f, 255.0f , 255.0f ) ) ;

        }


    if(drawFluid) {
        ofClear(0);
        glColor3f(1, 1, 1);
        fluidDrawer.draw(0, 0, ofGetWidth(), ofGetHeight());
    } else {
        //		if(ofGetFrameNum()%5==0)
        fadeToColor(0, 0, 0, 0.01);
    }
    if(drawParticles)
        particleSystem.updateAndDraw(fluidSolver, ofGetWindowSize(), drawFluid);

    //ofDrawBitmapString(sz, 50, 50);


    //earth here
    int alpha = 120; // amount of smoothing
    ofEnableAlphaBlending();
    ofSetColor(255, 255, 255, alpha);
    ofTranslate(ofGetWidth()/2, ofGetHeight()/2, 0);

    ofRotateY(ofGetFrameNum());
    ofRotateX(-90); //north pole facing up

    //bind and draw texture
    texture.getTexture().bind();
    gluSphere(quadric, 200, 100, 100);
    texture.draw(0, 0);

    ofDisableAlphaBlending();
    ofPopMatrix();



   }


//-------------------------------------------------------------
void ofApp::drawGui(ofEventArgs & args){
    gui.draw();
}


//--------------------------------------------------------------

//--------------------------------------------------------------
void ofApp::exit() {

}

//--------------------------------------------------------------
void ofApp::keyPressed (int key) {

    switch(key) {
        case '1':
            fluidDrawer.setDrawMode(msa::fluid::kDrawColor);
            break;

        case '2':
            fluidDrawer.setDrawMode(msa::fluid::kDrawMotion);
            break;

        case '3':
            fluidDrawer.setDrawMode(msa::fluid::kDrawSpeed);
            break;

        case '4':
            fluidDrawer.setDrawMode(msa::fluid::kDrawVectors);
            break;

        case 'd':
            drawFluid ^= true;
            break;

        case 'p':
            drawParticles ^= true;
            break;

        case 'f':
            ofToggleFullscreen();
            break;

        case 'r':
            fluidSolver.reset();
            break;

        case 'k':
            bKinectOpen ^=true;
            break;

        case 'w': {
            ofEnableAlphaBlending();
            texture.getTexture().bind(); // getTextureReference() is deprecated
            gluSphere(quadric, 200, 100, 100);
            texture.draw(0, 0);
            if (holdingbutton) {
                newalpha = oldalpha - 1;
                if (newalpha < 0) { newalpha = 0; }
                ofSetColor(255, 255, 255, newalpha);
                oldalpha = newalpha;
            } else {
                ofSetColor(255, 255, 255, 255);
            }
            texture.draw(0, 0);
            ofDisableAlphaBlending();
            break; // prevent fall-through into case 'e'
        }

        case 'e': {
            ofEnableAlphaBlending();
            texture.draw(0, 0);
            texture.getTexture().bind(); // getTextureReference() is deprecated
            gluSphere(quadric, 200, 100, 100);
            if (holdingbutton) {
                newalpha = oldalpha + 1;
                if (newalpha > 255) { newalpha = 255; }
                ofSetColor(255, 255, 255, newalpha);
                oldalpha = newalpha;
            } else {
                ofSetColor(255, 255, 255, 0);
            }
            texture.draw(0, 0);
            ofDisableAlphaBlending();
            break; // prevent fall-through into case 'b'
        }



        case 'b': {
            //			Timer timer;
            //			const int ITERS = 3000;
            //			timer.start();
            //			for(int i = 0; i < ITERS; ++i) fluidSolver.update();
            //			timer.stop();
            //			cout << ITERS << " iterations took " << timer.getSeconds() << " seconds." << std::endl;
        }
            break;


        case 'o':
            bShowControlPanel = !bShowControlPanel;
            if (bShowControlPanel){
                gui.show();
                ofShowCursor();
            } else {
                gui.hide();
                ofHideCursor(); // hide the cursor together with the panel
            }
            break;

        // character literals, not raw integers: case 5 would never match a printable key
        case '5':
            blendMode = OF_BLENDMODE_ALPHA;
            eventString = "Alpha";
            break;
        case '6':
            blendMode = OF_BLENDMODE_ADD;
            eventString = "Add";
            break;
        case '7':
            blendMode = OF_BLENDMODE_MULTIPLY;
            eventString = "Multiply";
            break;
        case '8':
            blendMode = OF_BLENDMODE_SUBTRACT;
            eventString = "Subtract";
            break;
        case '9':
            blendMode = OF_BLENDMODE_SCREEN;
            eventString = "Screen";
            break;
        default:
            break;


    }
}
//--------------------------------------------------------------
void ofApp::mousePressed(int x, int y, int button)
{}

//--------------------------------------------------------------
void ofApp::mouseReleased(int x, int y, int button)
{}

//--------------------------------------------------------------
void ofApp::windowResized(int w, int h)
{}

Micro-computing

Physical computing

Introduction to physical computing - microcontrollers. Programming Interactivity (Noble 2012): Chapter 4: Arduino.

Introduction to Raspberry Pi (https://www.raspberrypi.org/learning/hardware-guide/).

Interactive Performance.

Artistic methods and techniques which give performers (usually dancers or musicians) control of their medium in real time.

Wireless sensors worn on the performer's body and placed around the theater.

Sensors

Use of sensors: touch, movement, elasticity, camera.

Programming Interactivity

Experimenting with various sensors.

Experimenting with Arduino and node.js.

Arduino basic examples: (https://www.arduino.cc/en/Tutorial/BuiltInExamples).

Using Python OSC communication libraries

(https://pypi.python.org/pypi/python-osc) with SuperCollider.
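For day-to-day use, python-osc's `SimpleUDPClient` is the usual route. As a minimal sketch of what such a bridge actually sends (the `/freq` address is my example, and sclang's default UDP port of 57120 is assumed), an OSC message can be built and sent with the standard library alone:

```python
import socket
import struct

def osc_string(s):
    # OSC strings are ASCII, null-terminated, padded to a 4-byte boundary
    b = s.encode("ascii") + b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_message(address, *args):
    # minimal encoder: float32 arguments only (python-osc covers the full spec)
    typetags = "," + "f" * len(args)
    payload = b"".join(struct.pack(">f", a) for a in args)
    return osc_string(address) + osc_string(typetags) + payload

# send /freq 440.0 to sclang (default UDP port 57120) on the same machine
msg = osc_message("/freq", 440.0)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(msg, ("127.0.0.1", 57120))
```

On the SuperCollider side, an `OSCdef` registered on `'/freq'` would receive the value.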

Raspberry Pi

Install the Raspbian Jessie Lite OS image on the SD card

Download Raspbian here: raspbian

Follow the instructions here: installation-guide

Installing Raspbian Jessie Lite on the Raspberry Pi

Insert your SD card into your Mac

Identify the disk (not partition) of your SD card, e.g. disk4 (not disk4s1):

diskutil list

You can also check the disk number in About This Mac -> System Report.

Then open Disk Utility, choose the partition of the SD card and unmount it so that it lets you erase it.

$ sudo dd if="raspbian jessie.img" of=/dev/disk2 bs=1m
dd: /dev/disk2: Resource busy
$ diskutil umountDisk disk2
Unmount of all volumes on disk2 was successful
$ sudo dd if=ro519-rc6-1876M.img of=/dev/disk2 bs=1m

Then open terminal and write the following

$ sudo dd bs=1m if=path_of_your_image.img of=/dev/rdiskn

Remember to replace n with the number that you noted before!

Use Activity Monitor on the Mac to see the progress.

Eject the SD card and connect it to the Raspberry Pi.

Set up network

Log in from an external screen and edit wpa_supplicant.conf

Also enable SSH while you have the Raspberry Pi on the external monitor.

Open raspi-config

Choose Shell

$ sudo raspi-config

Go to =Advanced Options=, enable SSH and reboot

When you try to connect using ssh pi@raspberrypi.local you might come across this warning

@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
@       WARNING: POSSIBLE DNS SPOOFING DETECTED!          @
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
The ECDSA host key for raspberrypi.local has changed,
and the key for the corresponding IP address uu
has a different value. This could either mean that
DNS SPOOFING is happening or the IP address for the host
and its host key have changed at the same time.
Offending key for IP in /Users/user/.ssh/known_hosts:7
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
@    WARNING: REMOTE HOST IDENTIFICATION HAS CHANGED!     @
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
IT IS POSSIBLE THAT SOMEONE IS DOING SOMETHING NASTY!
Someone could be eavesdropping on you right now (man-in-the-middle attack)!
It is also possible that a host key has just been changed.
The fingerprint for the ECDSA key sent by the remote host is
h. Please contact your system administrator.
Add correct host key in /Users/vasilis/.ssh/known_hosts to get rid of this
 message.
Offending ECDSA key in /Users/vasilis/.ssh/known_hosts:19
ECDSA host key for raspberrypi.local has changed and you have requested strict
 checking.
Host key verification failed.

To solve the problem, delete the previous keys for raspberrypi.local from your known_hosts file,
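The canonical tool for this is `ssh-keygen -R raspberrypi.local`, which removes the stale entry and keeps a backup. Purely as an illustration of what that does (the function name and path handling below are mine, not part of any tool), the same filtering can be sketched in Python:

```python
from pathlib import Path

def remove_host_keys(known_hosts, hostname):
    """Drop every known_hosts line whose host field mentions `hostname`."""
    path = Path(known_hosts)
    # the host field is everything before the first space on each line
    kept = [line for line in path.read_text().splitlines()
            if hostname not in line.split(" ", 1)[0]]
    path.write_text("".join(k + "\n" for k in kept))
```

Unlike `ssh-keygen -R`, this sketch handles neither hashed hostnames nor backups, so prefer the real tool.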

then try to login

$ ssh pi@raspberrypi.local

and type yes to accept a new permanent key for the host.

After that go and update

pi@raspberrypi:~ $ sudo apt-get update

and upgrade

pi@raspberrypi:~ $ sudo apt-get upgrade

Then you can start installing software such as SuperCollider, Emacs, etc.

Ethernet set up

Building from Source on Raspberry

- connect an Ethernet cable from the network router or your computer to the RPi

- insert the SD card and the USB soundcard

- last, connect USB power from a 5V@1A power supply

Install emacs

Choose Shell

pi@raspberrypi:~ $ sudo apt-get install emacs

Quick installation of Prelude Emacs.

Install Git first:

pi@raspberrypi:~ $ sudo apt-get install git
pi@raspberrypi:~ $ curl -L https://git.io/epre | sh

The Meta key in terminal Emacs is the ESC key, so press ESC x for M-x.

The next step is to install SuperCollider support in Emacs.

Install SuperCollider on RPI

Installation guide from: Building SC on RPI

Choose Shell

$ ssh pi@raspberrypi.local #from your laptop, default password is raspberry
$ sudo raspi-config #change password, expand file system, reboot and log in again with ssh

update the system, install required libraries & compilers

$ sudo apt-get update

$ sudo apt-get upgrade

$ sudo apt-get install alsa-base libicu-dev libasound2-dev libsamplerate0-dev libsndfile1-dev libreadline-dev libxt-dev libudev-dev libavahi-client-dev libfftw3-dev cmake git gcc-4.8 g++-4.8

compile & install jackd (no d-bus)

$ git clone git://github.com/jackaudio/jack2 --depth 1
$ cd jack2
$ ./waf configure --alsa #note: here we use the default gcc-4.9
$ ./waf build
$ sudo ./waf install
$ sudo ldconfig
$ cd ..
$ rm -rf jack2
$ sudo nano /etc/security/limits.conf #and add the following two lines at the end
    @audio - memlock 256000
    @audio - rtprio 75
$ exit #and log in again to make the limits.conf settings work

compile & install sc master

$ git clone --recursive git://github.com/supercollider/supercollider
# optionally add --depth 1 here if you only need master
$ cd supercollider
$ git submodule init && git submodule update
$ mkdir build && cd build
$ export CC=/usr/bin/gcc-4.8 # here temporarily use the older gcc-4.8
$ export CXX=/usr/bin/g++-4.8
$ cmake -L -DCMAKE_BUILD_TYPE="Release" -DBUILD_TESTING=OFF -DSSE=OFF -DSSE2=OFF \
    -DSUPERNOVA=OFF -DNOVA_SIMD=ON -DNATIVE=OFF -DSC_ED=OFF \
    -DSC_WII=OFF -DSC_IDE=OFF -DSC_QT=OFF -DSC_EL=OFF -DSC_VIM=OFF \
    -DCMAKE_C_FLAGS="-mtune=cortex-a7 -mfloat-abi=hard -mfpu=neon -funsafe-math-optimizations" \
    -DCMAKE_CXX_FLAGS="-mtune=cortex-a7 -mfloat-abi=hard -mfpu=neon -funsafe-math-optimizations" ..
$ make -j 4 # leave out flag -j 4 on single core rpi models
$ sudo make install
$ sudo ldconfig
$ cd ../..
$ rm -rf supercollider
$ sudo mv /usr/local/share/SuperCollider/SCClassLibrary/Common/GUI \
    /usr/local/share/SuperCollider/SCClassLibrary/scide_scqt/GUI
$ sudo mv /usr/local/share/SuperCollider/SCClassLibrary/JITLib/GUI \
    /usr/local/share/SuperCollider/SCClassLibrary/scide_scqt/JITLibGUI

start jack & sclang & test

$ jackd -P75 -dalsa -dhw:1 -p1024 -n3 -s -r44100 &
# edit -dhw:1 to match your soundcard; it is usually 1 for USB. Or address the card by name:
# jackd -P75 -dalsa -dhw:UA25EX -p1024 -n3 -s -r44100 &
$ sclang # should start sc and compile the class library with only 3 harmless class overwrite warnings
    s.boot // should boot the server
    a = {SinOsc.ar([400, 404])}.play // should play sound in both channels
    a.free
    {1000000.do{2.5.sqrt}}.bench // benchmark: ~0.89 for rpi2, ~3.1 for rpi1
    a = {Mix(50.collect{RLPF.ar(SinOsc.ar)});DC.ar(0)}.play // benchmark
    s.dump // avgCPU should show ~19% for rpi2 and ~73% for rpi1
    a.free
    0.exit // quit sclang
$ pkill jackd # quit jackd

Run SuperCollider on emacs

raspberrypi_SuperCollider

Create a directory packages in ~/.emacs.d/personal/

and mv /directory-of-scel/el/ to /packages-directory/

then write in the init.el file

Choose emacs-lisp

(add-to-list 'load-path "~/.emacs.d/personal/packages/el")
(require 'sclang)

Create an Extensions directory in =/usr/local/share/SuperCollider/=

and cp the sc directory from =~/supercollider/editors/scel/sc= to /usr/local/share/SuperCollider/Extensions/

Then type to the terminal

Choose Shell

pi@raspberrypi:~ $ jackd -P75 -dalsa -dhw:1 -p1024 -n3 -s -r44100 &
//pi@raspberrypi:~ $ scsynth -u 57110 &
pi@raspberrypi:~ $ emacs -sclang

Copy directories from mac to pi using terminal

$ scp -r /path/to/directory pi@raspberrypi:~/path/to/remote/dir

example:

$ scp -r /Users/path pi@raspberrypi:~/SC_Stuff

Open raspberrypi3 from emacs using TRAMP

C-x C-f and type

/ssh:pi@raspberrypi:

then type your raspberry password

pass: *

and then use dired freely, open files, and use a shell as well to run programs such as sclang

Edit and save files using tramp

C-x C-f and type

/ssh:pi@raspberrypi|sudo:root@raspberrypi:

Dired to your file, make changes, and save it!

Copy files from the Raspberry Pi to the Mac and the opposite

scp /path/to/py/file pi@raspberrypi:~

Replace raspberrypi with the ip address of the Pi if using the hostname doesn't work.

or from the pi

scp macuser@macipaddress:/path/to/py/file ~

Replace macuser and macipaddress with your Mac user name and your Mac's IP address.

https://www.raspberrypi.org/forums/viewtopic.php?t=35152&p=296946

The following syntax is used to rename files with mv:

mv [options] filename1.ext filename2.ext

Use rsync

On your Mac, go to the directory you want to sync using cd in your terminal and type the following command:

Transferring files from Raspberry Pi to Mac

Choose Shell

$ rsync -avz -e ssh pi@192.168.1.96:/home/pi/.local/share/SuperCollider/Recordings/ Recordings/

and

$ rsync -avz -e ssh pi@192.168.1.96:~/sounds/voices/ voicesA/
pi@192.168.1.96's password:
receiving file list ... done
created directory Weaving-voices

Transferring files from Mac to Raspberry Pi

$ rsync -avP sounds/ pi@192.168.1.96:~/sounds/voices/voicesA/

Unfortunately it doesn't work with the name of the Raspberry Pi; instead you need to find the IP of the Raspberry Pi. To find the IP, ssh to your Raspberry Pi and type:

$ sudo ifconfig

rsync will then ask for the password:

pi@192.168.1.96's password:
receiving file list ... done
created directory Recordings
./
SC_161215_114846.aiff
SC_170422_120739.aiff
SC_170422_135403.aiff

sent 88 bytes  received 11223209 bytes  477587.11 bytes/sec
total size is 25854264  speedup is 2.30
➜  Recordings git:(SuperCollider)

Install Adafruit MPR121 on Raspberry

Soldering

Prepare the header strip, add the breakout board, and solder.

see more here

After you've soldered the sensor to the header strip, move to the wiring section:

Wiring

Place the MPR121 board into a breadboard and connect its inputs to the electrodes you plan to use. Then follow the wiring below for your platform to connect the MPR121 to an I2C communication channel. On a Raspberry Pi, connect the hardware as follows. Note: make sure you've enabled I2C on your Raspberry Pi!

First make sure that you've installed Python.

Choose Shell

sudo apt-get update
sudo apt-get install build-essential python-dev python-smbus python-pip git

Then clone Adafruit_Python_MPR121.git

cd ~
git clone https://github.com/adafruit/Adafruit_Python_MPR121.git

and install it

cd Adafruit_Python_MPR121
sudo python setup.py install

Configuring I2C

I2C is a very commonly used standard designed to allow one chip to talk to another. So, since the Raspberry Pi can talk I2C we can connect it to a variety of I2C capable chips and modules. Here are some of the Adafruit projects that make use of I2C devices and modules:

https://learn.adafruit.com/adafruits-raspberry-pi-lesson-4-gpio-setup/configuring-i2c

Testing I2C

Now when you log in you can type the following command to see all the connected devices

Choose Shell

$ sudo i2cdetect -y 1

Usage

Choose Shell

cd examples
sudo python simpletest.py

These are the output values of the 12 capacitive touch inputs.
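Each electrode corresponds to one bit of the sensor's touch state, so the library also exposes the whole state as a 12-bit mask. As a small sketch (the helper name is mine, not Adafruit's), that mask can be decoded into pin numbers like this:

```python
def touched_pins(mask):
    """Decode a 12-bit MPR121 touch bitmask into a list of touched pin indices."""
    return [pin for pin in range(12) if mask & (1 << pin)]

# example: electrodes 0 and 2 pressed -> bits 0 and 2 set
print(touched_pins(0b000000000101))  # [0, 2]
```

With the real hardware, the mask would come from the library's touch-state read instead of a literal.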

Change permissions of read-only files

Choose Shell

➜  ~ cd /Volumes
➜  /Volumes ls
Macintosh HD boot
➜  /Volumes cd boot
➜  boot ls
.
➜  boot sudo chmod a+w cmdline.txt
Password:
➜  boot

erase sd card

Open terminal and type:

Choose Shell

diskutil eraseVolume ExFAT MyName diskX

change diskX to your disk identifier, e.g. disk2s1

Suggested Bibliography

Books

