In Second Life there are many people who make machinima of all types and genres. But two people in particular film live concerts, with very different approaches.
Glasz films the concerts and works extensively in post-production, turning each video into a personal work. D-oo-b, a musician himself, instead makes true documentaries full of ideas and original shots.
So now at The Hexagon there are two corners reserved for their creations, where you can see some of their work on a dedicated screen.
The screens are really simple to use: just turn them on.
Next to each screen there is a small cube; click it to receive the relevant documentation. Both artists have a large body of work, so refer to the notecards for access to their archives on the web.
I think this is an important addition to The Hexagons, and watching machinima about musicians in Second Life is a pleasure. Enjoy watching.
For those who missed the unforgettable Halloween 2024 concert by nnoiz Papp at The Hexagons, the video recording is here! This exclusive video captures every moment of nnoiz’s extraordinary live performance, weaving modular synthesizers with live instruments—including a hauntingly beautiful oboe performance. Known for his inventive approach to sound design and deep musical expertise, nnoiz Papp left the audience spellbound.
From his renowned work with Sendung mit der Maus to his celebrated collaborations with artists like Klaus Schulze, nnoiz Papp’s career is a testament to innovation in sound. This video release is a must-watch for fans of experimental music and anyone fascinated by live electronic and instrumental fusion. Don’t miss the chance to experience his Halloween concert’s atmosphere, intensity, and raw creativity.
nnoiz Papp
// score composer / music for animated movies / sound designer / keyboardist / oboist / guitarist
1980-1986 Musikhochschule Köln (studied music education, oboe and (jazz) piano)
since 1985 over 300 music productions for German children's TV (Sendung mit der Maus)
since 2007 thousands of trailers, songs and sound-design projects for TV (Sendung mit dem Elefanten)
since 1982 working as a studio musician on oboe (for example with Klaus Schulze) and keyboards (since 1984 with computers) in many different styles (from pop to heavy metal with U.D.O. and AXXIS)
18 CD productions (4 under the pseudonym "SVENSSON" – electronic & oboe; 4 archive-music CDs for Selected Sound, Koch Music Library and Sonoton)
live music with TRIOGLYZERIN, a trio that plays live to old silent movies in cinemas (http://www.trioglyzerin.com)
since 1996 various internet activities (QuickTime VR – Flash – 3D)
2006 live video installation at the Wuppertaler Bühnen, visualizing Nyman's opera "The Man Who Mistook His Wife for a Hat" with VJ software
2012 live video installation at the Wuppertaler Bühnen, visualizing Ali Askin's opera ISTANBUL
In SL since the end of May 2007, trying to put it all together…
2017 building up a modular synthesizer system
2022 starting to test different AI tools for sound, text and graphics
Taking advantage of the new location and the renewed attention, we would like to organize a small festival, refresh the walls a bit, and invite all composers of original music in SL to keep sending a photo and a note about themselves to Livio Korobase and/or Renee Rebane, so they can be added to the directory.
Many are already on the walls, but there is still room.
Some time ago I bought a piece of VJ software. VJing (pronounced VEE-JAY-ing) is a broad designation for real-time visual performance: the creation or manipulation of imagery in real time, through technological mediation and for an audience, in synchronization with music.
NestDrop is an ingenious piece of software based on the visualization engine of Milkdrop, originally developed by Ryan Geiss in 2001.
Milkdrop at work. On the left, the script that builds the image on the right.
I like Milkdrop's beat-detection system: it works well, and we all know how music and lights in sync can produce pleasant moments.
So, after playing around on my own for a bit, I said to myself: why not try to do a live music and light show in Second Life?
It's not as simple as it seems, especially when it comes to video in Second Life. The support is rudimentary: all you can do is apply a URL to the face of an object and start playing it (this is called MOAP, media on a prim). However, this does not guarantee at all that everyone will experience the event together, because in the case of a film, for example, each user starts playback from the beginning, so anyone who arrives late will never be aligned with the other participants, who are perhaps not even aligned with each other. We would like to have a party, not each watch a movie on a separate sofa. Give us our keyframe.
There are systems in Second Life that try to overcome this problem with scripting and other tricks, but they are unreliable and complex. How can it be done in a transparent, simple and economical way?
One day I visited a sim, Museum Island, where a user nicknamed Cimafilo was streaming a movie, Alice in Wonderland, and I noticed that the data stream was small but fluid and in sync (even though the sim was using parcel media and it was not possible to tell who or what was managing the sync), so I tried to find out more. Cima was using a video format I didn't know, WebM, together with OBS, and with good results in my opinion. So I tried to create a similar system, but suited to my needs: an analogous workflow, but using a dedicated server and a higher frame rate.
I can say I succeeded: the system I devised can stream and sync audio and video events in Second Life using simple open-source tools. Let's see how.
As mentioned, my goal was to use a VJ program to create an audiovisual event in Second Life.
All you need is an audio player (personally I use Winamp because it's very light, but any other player is fine) and a piece of VJ software (in my case NestDrop, but any other will do). The result of our audio and video mix must be capturable by OBS (Open Broadcaster Software), so any program that generates video output on your PC monitor works.
Your desktop at the end of the first part of the setup: from the left, Winamp produces the sound, NestDrop makes the visuals and OBS captures it all.
I was thinking to myself: how can I send this output from OBS to Second Life with acceptable quality and in sync for everyone? Showing the video is (almost) easy; there are many methods, from a web page to a streaming service, but they all lack the detail that is so important to me: sync.
However, I noticed one thing: almost all of these services use the MP4 container or one of its variants to distribute the content. And by studying, I realized that the MP4 container does not have a sync mechanism suitable for my purpose: I need sync information to be sent at pre-established intervals during the projection. The codec world is a real jungle, shaped by trade wars.
At this point I entered a hell of questions and answers on forums and web pages, approximate and/or wrong answers, you name it. I'm no expert on these things either, and this was new territory for me. Double the difficulty, then.
I convinced myself along the way that the secret was in the format, and indeed it was.
Wrestling with the FFmpeg specs, I discovered the format that was right for me: compact, with good audio (Opus) and video (VP8 or VP9), open source, and viewable in every browser. WebM was the trick.
Above all, one line of the WebM container specification struck me: "Key frames SHOULD be placed at the beginning of clusters."
Exactly what I was looking for: bingo. In a few words, when Play is pressed in the SL viewer, each visitor gets a keyframe to sync to at the exact point the stream has actually reached. Yes, I want to use this!
Okay, but where do I send this WebM stream, assuming I manage to convince OBS to broadcast it?
Reading here, searching there: a server that accepts WebM is Icecast2.
Now comes the slightly more complex part, because you need Icecast video hosting, which works like the Shoutcast hosting you surely already know if you have ever played music in Second Life. Or you can set up your own server. I obviously chose the second path, both for its affordability and to fully understand how it works.
On the web it is easy to find virtual servers on offer, even at very low prices. I got a VPS with 1 processor, 1 GB of RAM and a 100 GB hard disk. For an experiment it's fine, and in any case you can always expand later. Off we go.
Installing a Linux distro is usually quite automated: just choose the desired distro from a drop-down menu and in a few minutes the basic server is operational. So far so good; I chose Ubuntu 20.04.
Preparing my VPS
In a few minutes the server is ready. I chose not to install anything apart from Icecast, but obviously the space can be used for anything else you need.
Installing Icecast2 is very simple: all you need is a little familiarity with terminal commands (for me, via PuTTY) and your server is ready in 10 minutes. You can find dozens of tutorials, all copy/pasted from each other.
The only details I recommend you take care of are to open port 8000 in the server firewall, or no one will be able to connect, and, on the audio side, to set your Windows sound card to 48000 Hz. You don't need anything else.
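By the way, a quick way to check from your own PC that the port is really reachable from outside is a tiny socket test. Here is a minimal sketch in Python (the host name is just the example server used below; put your own there):

```python
# Sketch: check from your own PC that the Icecast port is reachable from outside.
import socket

HOST = "vps-94bab050.vps.ovh.net"  # example server used in this post; replace with yours
PORT = 8000

try:
    with socket.create_connection((HOST, PORT), timeout=5):
        print(f"Port {PORT} on {HOST} is open, Icecast should be reachable.")
except OSError as exc:
    print(f"Could not connect to {HOST}:{PORT}: {exc} (check the firewall rule).")
```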
Now test whether Icecast is responding by opening a browser and adding :8000 to your server address. In my case, for example: http://vps-94bab050.vps.ovh.net:8000/
The server will most likely be listening.
Pelican Village TV waiting for a mountpoint
When configuring Icecast (you will be asked to specify an admin password and other details), you may have doubts about mountpoints, and the documentation isn't very clear either: should I create them now? Or when? In reality you don't have to do anything; OBS (or whatever software you use to broadcast) will create the mountpoint with the name you choose in the connection string (we'll see how later).
I connected OBS to the server, and the connection created the mountpoint /video, as specified in the connection string.
Opening the URL http://vps-94bab050.vps.ovh.net:8000/video (following the example), you can already see in a browser window what you are going to stream into Second Life (if someone is streaming; otherwise you get an "Error 404 Page Not Found"). The mountpoint is dynamic, meaning it is alive only as long as the stream is up. When you disconnect your broadcaster, the mountpoint disappears.
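If you prefer checking this from a script instead of a browser, Icecast (2.4 and later) also publishes its status as JSON at /status-json.xsl. Here is a minimal sketch that reports whether a given mountpoint is live; the host and mountpoint are the example ones from above, so replace them with yours:

```python
# Sketch: query Icecast's JSON status page and report whether a mountpoint is live.
# Host and mountpoint follow the example in this post; replace them with your own.
import json
import urllib.request

STATUS_URL = "http://vps-94bab050.vps.ovh.net:8000/status-json.xsl"
MOUNT = "/video"

with urllib.request.urlopen(STATUS_URL, timeout=10) as resp:
    stats = json.load(resp)["icestats"]

# "source" is absent when nothing is streaming, a dict for one mount, a list for several.
sources = stats.get("source", [])
if isinstance(sources, dict):
    sources = [sources]

urls = [s.get("listenurl", "") for s in sources]
if any(url.endswith(MOUNT) for url in urls):
    print(f"{MOUNT} is live: {urls}")
else:
    print(f"{MOUNT} is not streaming right now (mountpoints up: {urls or 'none'})")
```

When nothing is being broadcast, the source entry simply isn't there, which is exactly the dynamic-mountpoint behaviour described above.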
Connecting OBS to the Icecast server is not entirely straightforward, but it can be done.
In File > Settings > Output, set the Output Mode to Advanced, then under Recording:
Type: Custom Output (FFmpeg)
FFmpeg Output Type: Output to URL
File path or URL: icecast://username:password@serverURL:8000/mountpointname (the username and password are the ones you set when installing Icecast2, remember?)
Container Format and Container Format Description: webm
Muxer Settings and Encoder Settings are a chapter of their own: they have to be chosen by carefully reading the documentation at https://trac.ffmpeg.org/wiki/EncodingForStreamingSites and https://developers.google.com/media/vp9/live-encoding. The right configuration depends on many factors: your material, hardware, connection and server. For me personally, the string that worked for Muxer Settings is:
(For the meaning of each parameter and some sample configurations, refer to the specs.)
To start broadcasting, press the Start Recording button in OBS (we are "recording" to a URL).
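As a side note, if you want to test the whole chain from the command line before involving OBS, roughly the same kind of stream can be produced directly with ffmpeg. This is only a sketch of the idea, not my exact settings: the host, credentials, mountpoint and input file are placeholders, and it assumes an ffmpeg build with libvpx, libopus and Icecast support.

```python
# Sketch: push a WebM (VP8 + Opus) test stream to an Icecast mountpoint with ffmpeg.
# Host, credentials, mountpoint and input file are placeholders for your own setup.
import subprocess

ICECAST_URL = "icecast://source:hackme@vps-94bab050.vps.ovh.net:8000/video"

cmd = [
    "ffmpeg",
    "-re", "-stream_loop", "-1", "-i", "demo.mp4",      # read a test file at native speed, loop forever
    "-c:v", "libvpx", "-b:v", "1500k",                  # VP8 video at roughly 1.5 Mbit/s
    "-deadline", "realtime", "-cpu-used", "8",          # favour encoding speed over quality (live use)
    "-g", "60",                                         # a keyframe (new WebM cluster) every 60 frames
    "-c:a", "libopus", "-b:a", "128k", "-ar", "48000",  # Opus audio at 48 kHz
    "-f", "webm",                                       # WebM container
    "-content_type", "video/webm",                      # MIME type announced to Icecast
    ICECAST_URL,
]
subprocess.run(cmd, check=True)
```

The -g option decides how often keyframes (and therefore new WebM clusters) are produced, which is what late joiners lock on to.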
Now all you need is to prepare the "screen". In Second Life, associate the URL of your mountpoint with the media texture of the object you have chosen as the screen and press Play in the viewer. Nothing else is needed. Let the rave begin 🙂
What a strange title, right? I’ll try to explain better.
Some time ago, I received an invitation to play at this event:
Astral Stories presents Nino Vichan and Waveform in ✦ Fair is Foul, and Foul is Fair ✦
12 pm SLT Saturday, 11 November – art opening of Nino Vichan's scenes from Macbeth
12 pm SLT Sunday, 17 December – opera recital of Macbeth-inspired pieces by Merryjest
12 pm SLT Saturday, 20 January – live improvised music performances by Poppy Morris, Echo Starship, Daddio Dow, Livio Korobase
Could I say no?
And I'm fine with improvisation, but let's take it calmly. I'm not a musician who can improvise much, since I'm not an octopus (for now), and I like to use a lot of instruments. So first of all I prepare my working tools.
When Macbeth enters, he is horrified by what he has done. He has brought with him the daggers that he used on Duncan, instead of leaving them in the room with Duncan’s servants as Lady Macbeth had planned. When he finds himself incapable of returning the daggers, Lady Macbeth does so. She returns to find Macbeth still paralyzed with horror and urges him to put on his gown and wash the blood from his hands.
So I started studying the how and why, and for this event I prepared a palette of sounds and synths, all in-the-box. It is not a simple thing to “set” the scene of an opera and play it. So I’ve identified some tools to make the performance easier.
You certainly need a DAW, which for me is PreSonus Studio One; I will use it as a looper and to organize sound changes and so on. It's easy to create tracks and assign each one what it needs: the right VST, automations, etc. One difficult thing is playing and mixing at the same time, which, since this is a primarily sonic, sound-based event, requires a lot of attention.
A really annoying thing is mixing with the mouse… but I have an iPad, and Studio One has a dedicated mixer interface (Studio One Remote) that lets you move faders with your fingertips on a touch screen. Great, a true mixer.
Using Studio One Remote on an iPad for mixing.
Another tool that helps a lot in these events is the Studio One Rehearse and Perform template, made especially for these situations.
With Studio One 6 and the Rehearse and Perform smart template, Studio One allows you to get up and running incredibly quickly with a templated session – instead of sitting down, adding all of the elements you need for the performance, linking them together etc, the smart template sets it all up for you, making it easier to go from booting up to creating in a matter of seconds. Great and easy to use.
Performing live?
I add two empty audio tracks for playing live (more would be useless, I still only have two hands) and that's it.
That's where we are so far. But do we have everything? No: we still have to connect the DAW's output to Shoutcast somehow, or no one will hear anything in world.
Personally, the simplest and most effective solution I've found is a VST plugin called ShoutVST. It is available in 32- and 64-bit versions and installation is really simple. Fill in the plugin interface with your stream data; when you press Connect in the plugin and Play in the DAW, it starts sending your music to the Shoutcast server.
There are probably other solutions, but this one is free and works. If you use a different one, let me know in the comments.
Today is the 12th, so there are still a few days left before the 20th, the day of the show. I feel like I have everything ready, but who knows.
In any case, we'll see you at the event at Waveform: Fair is Foul, and Foul is Fair, 12 pm SLT, Saturday, 20 January.
I think the musicians here, apart from me, are among the best in Second Life. If you like music, it's a good opportunity to listen to something interesting.
Bsukmet began his musical journey at the age of 6, attending piano and music theory classes at an academy, with exams through the Toledo Conservatory. Since he was little he has traveled to various places giving piano concerts with a classical repertoire.
In 2006 he joined the Madrid gothic pop-rock band Evil Thoughts, taking on the role of keyboardist and lead songwriter. With them he wrote two albums, the second of which he composed entirely himself.
In 2010 he entered the virtual world of Second Life, where he began a career as a DJ, focusing on artistic events such as the SL anniversary party and on projects with a real-world reach such as the Nice Carnival.
In 2019 he gave up his role as a DJ to start presenting his own creations in Second Life. Since then he has released four solo albums ("Zero", "Third Demise", "Shattered, Almost Detached" and "Imniali: Forced Perspective") as well as "Silent Beauties", an album made in collaboration with the violinist Fly Kugin.
Bsukmet mainly uses electronic and orchestral resources, but his style is eclectic. His method consists of using exclusively free resources available to everyone, avoiding sounds created by other composers, and writing each piece note by note.
After almost a year of weekly 'Music Listening Meditation' sessions, I can still be pleasantly surprised by the difference between listening on my own and listening with other people who have come together with the same intention. We listen.
I know that in real life some of them may be driving, some may be cleaning their kitchen, some could be sitting quietly in their chair. We are at different points in our day because of the time zones, and we bring different moods because we are different human beings. So it seems the intention to listen is maybe the "only" thing we have in common. We listen.
We always do two sessions of approximately 20 minutes within the meditation hour. During the second session I ask the participants to try and focus on the group. This can be as simple as being aware that there are others around us who hear the same sounds, listen to the same music, experience the same vibrations in their bodies. I am not sure whether this awareness can create a physical connection, but it can have an impact anyway.
For me, that impact is like this: I enjoy the company of familiar and new people, I am grateful that people care to join. At times I choose music that is a bit hard to digest, and then it helps to know we are not alone in going through that. Sometimes a visualisation of how sounds go around in the circle is enough for me to bring up a smile, or to sense a new tingle at the back of my neck. Often the music sounds more intense than when I am listening alone. I hear more when I listen better.
Music Listening Meditation will probably not cause major transitions, but it can give you one hour of awareness, of intentional and focused listening. You observe, and don’t judge. You don’t have to try too hard: just let it happen. And if something else happens, that is ok, too.
This is the 4th edition of this online micro-festival, part of the "Transonic Second Life Sessions". In different Second Life venues, these events offer concerts and audio-visual performances, gathering around a community of sound and multimedia artists a public of visitors who, in a spirit both playful and adventurous, discover or follow the evolution of international talents performing under various avatars.
Organisation and scenography: A Limb & Livio Korobase
This new series of performances by artists associated with the Transonic label in Second Life is also part of NoLA – No Lockdown Art, an initiative run by Transcultures and the European Pepinieres of Creation since the first lockdown of the health crisis in March 2020.
As some may remember, some time ago we opened this space in Museum Islands, which aims to be a kind of directory of people who produce their own music in Second Life. It is all explained in this earlier blog post. The Hexagons has also been added to the Second Life Destination Guide.
We have started a series of concerts; the season was opened by Tia Rungray and will continue in July with a small festival featuring various musicians active in Second Life and RL. The artists who have already joined the initiative are (in no particular order):
Tia Rungray, A Limb, Yadleen, Echo Starship, Bsukmet & Imniali, Daddio Dow, Livio Korobase, Art Olujia, Cypress Rosewood, nnoiz Papp, Martin Dimitrov Music, Mish Musicat, Chouchou, Poppy Morris, SM2, Mao Lemieux, Morlita Quan, Space Invaders, Aleatorica
I invite all business owners in Second Life to come and view the artists' biographies at The Hexagons: each portrait provides a folder with the artist's name, image and bio.
I also invite interested musicians to send an image and a bio, so as to make the directory even more complete.
The Hexagons is a great opportunity for musicians to showcase their work and for businesses to support local talent. I hope many will take the chance to get involved.
The doors are always open, and on the roof there is a space available for events and concerts, which can also be used as a rehearsal room.