<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom">
  <channel>
    <title>Emute Lab</title>
    <description>A Music Informatics and Performance Technologies Lab based in the School of Media, Film and Music at the University of Sussex</description>
    <link>http://www.emutelab.org/</link>
    <atom:link href="http://www.emutelab.org/feed.xml" rel="self" type="application/rss+xml" />
    <pubDate>Fri, 02 Jan 2026 12:40:23 +0000</pubDate>
    <lastBuildDate>Fri, 02 Jan 2026 12:40:23 +0000</lastBuildDate>
    <generator>Jekyll v3.10.0</generator>
    
      <item>
        <title>Strange Pulse Toolkit: Building Chaotic Instruments in Max Workshop</title>
        <description>&lt;p&gt;&lt;strong&gt;:::: When: Jan 19th 4-6:00pm UK time (GMT). Where: Zoom ::::&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Are you interested in creating &lt;b&gt;chaotic instruments&lt;/b&gt; for exploration and fun in Max? We are inviting you to participate in a free workshop taking place in January 2026. The workshop will introduce the &lt;a href=&quot;https://github.com/MaxWorgan/StrangePulseToolkit&quot;&gt;Strange Pulse Toolkit&lt;/a&gt; - a package for the &lt;a href=&quot;https://cycling74.com/products/max&quot;&gt;Max&lt;/a&gt; software environment, which explores the rhythmical potential of &lt;i&gt;Strange Attractors&lt;/i&gt; - simple mathematical models that exhibit chaotic behaviour.&lt;/p&gt;

&lt;p&gt;&lt;img src=&quot;/img/spt-screenshot.png&quot; alt=&quot;Screenshot of SPT&quot; /&gt;
&lt;i&gt;A screenshot of the Strange Pulse Toolkit&lt;/i&gt;&lt;/p&gt;

&lt;p&gt;The Strange Pulse Toolkit provides a suite of tools to enable the creation of instruments that embrace the chaotic nature of strange attractors. Rather than simply using a chaotic system as a &lt;i&gt;random&lt;/i&gt; modulator in an otherwise predictable system, the SPT encourages the embedding of chaos into the very core of the instrument, exchanging absolute control for surprise and unpredictability.&lt;/p&gt;
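&lt;p&gt;To give a flavour of how a strange attractor can drive rhythm, here is a minimal sketch in Python (an illustration only, not code from the SPT; the Lorenz system and the zero-crossing rule are choices made here for the example). It integrates the Lorenz equations with a simple Euler step and emits a pulse each time the x variable crosses zero upward, yielding timing that is irregular yet structured:&lt;/p&gt;

```python
# Illustration only (not SPT code): deriving pulse times from the
# Lorenz attractor, a classic strange attractor.
def lorenz_pulses(steps=20000, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = 1.0, 1.0, 1.0
    pulses = []
    prev_positive = x > 0.0
    for i in range(steps):
        # Euler integration of the Lorenz equations
        dx = sigma * (y - x)
        dy = x * (rho - z) - y
        dz = x * y - beta * z
        x, y, z = x + dx * dt, y + dy * dt, z + dz * dt
        now_positive = x > 0.0
        # a pulse fires on each upward zero crossing of x
        if now_positive and not prev_positive:
            pulses.append(i * dt)
        prev_positive = now_positive
    return pulses
```

&lt;p&gt;The intervals between successive pulses never settle into a fixed period - the kind of behaviour the toolkit puts at the core of an instrument rather than at its periphery.&lt;/p&gt;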

&lt;p&gt;&lt;strong&gt;Requirements&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The only requirements are some prior experience with Max and a working installation of it (ideally version 9, though this is not essential). The workshop will be conducted over Zoom, so you will also need to be able to join a Zoom call.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Registration:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;b&gt;To sign-up and for further information about the workshop: &lt;a href=&quot;https://universityofsussex.eu.qualtrics.com/jfe/form/SV_dniKILyw7C4W7kO&quot;&gt;SPT Workshop Registration&lt;/a&gt;&lt;/b&gt;&lt;/p&gt;

&lt;p&gt;This video shows a simple demo built with the SPT:&lt;/p&gt;
&lt;iframe width=&quot;560&quot; height=&quot;315&quot; src=&quot;https://www.youtube.com/embed/FF232wthxN0&quot; frameborder=&quot;0&quot; allow=&quot;accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share&quot; referrerpolicy=&quot;strict-origin-when-cross-origin&quot; allowfullscreen=&quot;&quot;&gt;&lt;/iframe&gt;

&lt;p&gt;For more technical information about the SPT, please refer to this article:&lt;/p&gt;

&lt;p&gt;Worgan, M. (2025). The Strange Pulse Toolkit. Proceedings of the 22nd Sound and Music Computing Conference (SMC2025), Graz, July 2025, 283–289. https://doi.org/10.5281/zenodo.15837615&lt;/p&gt;

&lt;p&gt;&lt;b&gt;Please register above and we hope to see you on Zoom on Jan 19th!&lt;/b&gt;&lt;/p&gt;


</description>
        <pubDate>Tue, 30 Dec 2025 00:00:00 +0000</pubDate>
        <link>http://www.emutelab.org/blog/SPT-workshop</link>
        <guid isPermaLink="true">http://www.emutelab.org/blog/SPT-workshop</guid>
        
        <category>emutelab</category>
        
<category>workshop</category>
        
        <category>max</category>
        
        <category>chaos</category>
        
        
        <category>blog</category>
        
      </item>
    
      <item>
        <title>Emute Lab Instruments @ London Synth and Pedal Expo</title>
<description>&lt;p&gt;Dimitris and Chris presented at the London Synth and Pedal Expo 2025.&lt;/p&gt;

&lt;iframe width=&quot;560&quot; height=&quot;315&quot; src=&quot;https://www.youtube.com/embed/k5juqLq37rU?si=9tk_5_AWRJmev3LD&quot; title=&quot;YouTube video player&quot; frameborder=&quot;0&quot; allow=&quot;accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share&quot; referrerpolicy=&quot;strict-origin-when-cross-origin&quot; allowfullscreen=&quot;&quot;&gt;&lt;/iframe&gt;

&lt;iframe width=&quot;560&quot; height=&quot;315&quot; src=&quot;https://www.youtube.com/embed/G8lr-LPybgM?si=wPBs3c4JTH-EyVJV&quot; title=&quot;YouTube video player&quot; frameborder=&quot;0&quot; allow=&quot;accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share&quot; referrerpolicy=&quot;strict-origin-when-cross-origin&quot; allowfullscreen=&quot;&quot;&gt;&lt;/iframe&gt;

</description>
        <pubDate>Sat, 22 Mar 2025 00:00:00 +0000</pubDate>
        <link>http://www.emutelab.org/blog/lspxpo</link>
        <guid isPermaLink="true">http://www.emutelab.org/blog/lspxpo</guid>
        
        <category>eli</category>
        
        <category>useq</category>
        
        <category>modular</category>
        
        <category>trade show</category>
        
        
        <category>blog</category>
        
      </item>
    
      <item>
        <title>Emute Lab Meeting March: Live Coding</title>
        <description>&lt;p&gt;Emute Lab is meeting in March with presentations from Dimitris Kyriakoudis and Aydin Kuru focusing on the practice of live coding.&lt;/p&gt;

&lt;p&gt;Dimitris will be presenting recent developments on the Eurorack live coding module that he and Chris Kiefer have been working on, specifically integrations with the code editor and dynamic visualisations.&lt;/p&gt;

&lt;p&gt;Aydin will be presenting his ongoing experimentations regarding the role and functions of typing in the context of live coding through a demo performance exhibiting the gesture of typing as a sonic phenomenon.&lt;/p&gt;

&lt;p&gt;Everyone is welcome at this meeting; we look forward to seeing you.&lt;/p&gt;

&lt;p&gt;Date: Tuesday, March 11th
Time: 4pm
Location: Room 76 (back of Falmer Bar)&lt;/p&gt;
</description>
        <pubDate>Mon, 03 Mar 2025 00:00:00 +0000</pubDate>
        <link>http://www.emutelab.org/blog/meetingmarch.2025</link>
        <guid isPermaLink="true">http://www.emutelab.org/blog/meetingmarch.2025</guid>
        
        <category>emutelab</category>
        
        
        <category>blog</category>
        
      </item>
    
      <item>
        <title>May Meeting: Cell Oscillations and Entangled Instruments</title>
        <description>&lt;p&gt;This month we will have presentations from Shu Yang and Steve Symons.&lt;/p&gt;

&lt;p&gt;Shu will be presenting audio-visual programmes (Max and TouchDesigner) that can be used to simulate vascular cell oscillations based on their proteins’ biochemical structures (AlphaFold 3, FPbase, and RCSB PDB).  The attached ‘cell’ image is the biomaterial he has created for a soft robotic system that simulates the cell using water, oil, gelatine and food dye.&lt;/p&gt;

&lt;p&gt;Steve will be presenting a proposal to use diffractive analysis to make sense of the messy data that emerges when we interrogate multi-player instruments, such as the entangled instrument the Stickatron (see second attached image), from a post-human perspective.&lt;/p&gt;

&lt;p&gt;Everyone is welcome - we look forward to seeing you.&lt;/p&gt;

</description>
        <pubDate>Wed, 22 May 2024 00:00:00 +0000</pubDate>
        <link>http://www.emutelab.org/blog/lab-meeting-copy</link>
        <guid isPermaLink="true">http://www.emutelab.org/blog/lab-meeting-copy</guid>
        
        <category>meeting</category>
        
        
        <category>blog</category>
        
      </item>
    
      <item>
        <title>April Meeting: Audio Inventions Presentation and Demo</title>
<description>&lt;p&gt;For our April meeting, we have special guests Paul and Brian from Audio Inventions, demonstrating their &lt;i&gt;Freedom Player&lt;/i&gt; system for woodwind instruments.
&lt;br /&gt;
&lt;a href=&quot;https://www.audio-inventions.co.uk/6/Home&quot;&gt;https://www.audio-inventions.co.uk/6/Home&lt;/a&gt;
&lt;br /&gt;&lt;/p&gt;

&lt;p&gt;Audio Inventions are located in the Sussex Innovation Centre on campus, and they are doing some fascinating work in instrument acoustics, machine listening and embedded sound synthesis.  They will be talking about their system and showing a demo.&lt;/p&gt;

&lt;p&gt;All welcome, from across campus.&lt;/p&gt;

&lt;p&gt;&lt;img src=&quot;https://www.audio-inventions.co.uk/_data/site/196/folder/6/clarinetExploded.jpg&quot; /&gt;&lt;/p&gt;

&lt;p&gt;&lt;br /&gt;
&lt;i&gt;Until now there hasn’t been an effective way to silence a woodwind instrument because the sound comes out of any open finger holes as well as the bell, so trumpet-style mutes don’t work.&lt;/i&gt;&lt;/p&gt;

&lt;p&gt;Freedom Player™ is an electronic practice mute for wind instruments that uses patented technology to ‘blow’ the instrument for you with a high-tech stimulus that we call Digital Breath™.
&lt;br /&gt;&lt;/p&gt;

&lt;p&gt;Date: Tuesday 30th April&lt;/p&gt;

&lt;p&gt;Time: 1pm&lt;/p&gt;

&lt;p&gt;Location: Sussex Humanities Lab&lt;/p&gt;

&lt;p&gt;Zoom: https://universityofsussex.zoom.us/j/94541042594&lt;/p&gt;

</description>
        <pubDate>Wed, 17 Apr 2024 00:00:00 +0000</pubDate>
        <link>http://www.emutelab.org/blog/lab-meeting_audioinv-copy</link>
        <guid isPermaLink="true">http://www.emutelab.org/blog/lab-meeting_audioinv-copy</guid>
        
        <category>meeting</category>
        
        <category>woodwind</category>
        
        <category>industry</category>
        
        <category>machine listening</category>
        
        
        <category>blog</category>
        
      </item>
    
      <item>
        <title>March Lab Meeting: Strange Attractors and Livecoding Modular Synths</title>
        <description>&lt;p&gt;On Tuesday 19th of March we have a double bill:&lt;/p&gt;

&lt;ul&gt;
  &lt;li&gt;
    &lt;p&gt;Max Worgan will be showing a couple of bits of musical software he has been developing around strange attractors.&lt;/p&gt;
  &lt;/li&gt;
  &lt;li&gt;
    &lt;p&gt;Chris Kiefer and Dimitris Kyriakoudis will be talking/demoing the uSEQ live coding module, some recent developments with the firmware &amp;amp; hardware, and the collaboration they are doing with Music Thing Modular to port the uSEQ firmware to their upcoming hybrid and programmable module + a new editor.&lt;/p&gt;
  &lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href=&quot;https://github.com/lnfiniteMonkeys/uSEQ&quot;&gt;https://github.com/lnfiniteMonkeys/uSEQ&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Date: Tuesday 19th of March
Time: 1pm
Location: Sussex Humanities Lab
Zoom: https://universityofsussex.zoom.us/j/99804772625&lt;/p&gt;

</description>
        <pubDate>Wed, 21 Feb 2024 00:00:00 +0000</pubDate>
        <link>http://www.emutelab.org/blog/lab-meeting-strange-attractors</link>
        <guid isPermaLink="true">http://www.emutelab.org/blog/lab-meeting-strange-attractors</guid>
        
        <category>meeting</category>
        
        <category>AI</category>
        
        <category>chaos</category>
        
        <category>attractors</category>
        
        <category>modular</category>
        
        <category>livecoding</category>
        
        
        <category>blog</category>
        
      </item>
    
      <item>
        <title>Google Summer of Code 2024 Project: Differentiable Logic Gate Networks for Real-Time Interaction</title>
        <description>&lt;p&gt;&lt;img src=&quot;/img/gsoc2024.png&quot; /&gt;&lt;br /&gt;&lt;/p&gt;

&lt;p&gt;Jack Armitage (from the Intelligent Instruments Lab, Iceland) is partnering with Chris Kiefer to explore real-time embedded AI again, this time based on the work of Stanford’s Felix Petersen on Differentiable Logic Gate Networks.&lt;/p&gt;

&lt;p&gt;If you’re interested in applying for this project, read the full project description over at the BeagleBoard forum, and get in touch!&lt;/p&gt;

&lt;p&gt;&lt;a href=&quot;https://forum.beagleboard.org/t/embedded-differentiable-logic-gate-networks-for-real-time-interactive-and-creative-applications/37768&quot;&gt;https://forum.beagleboard.org/t/embedded-differentiable-logic-gate-networks-for-real-time-interactive-and-creative-applications/37768&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;More links:&lt;/p&gt;

&lt;p&gt;&lt;a href=&quot;https://summerofcode.withgoogle.com/&quot;&gt;https://summerofcode.withgoogle.com/&lt;/a&gt;&lt;br /&gt;&lt;/p&gt;

&lt;p&gt;Timeline: &lt;a href=&quot;https://developers.google.com/open-source/gsoc/timeline&quot;&gt;https://developers.google.com/open-source/gsoc/timeline&lt;/a&gt;&lt;br /&gt;&lt;/p&gt;

&lt;p&gt;BeagleBoard GSOC pages (including guides): &lt;a href=&quot;https://gsoc.beagleboard.org/&quot;&gt;https://gsoc.beagleboard.org/&lt;/a&gt;&lt;br /&gt;&lt;/p&gt;

&lt;p&gt;BeagleBoard Discord (with #gsoc channel): &lt;a href=&quot;https://discord.gg/cNr5TSWZRz&quot;&gt;https://discord.gg/cNr5TSWZRz&lt;/a&gt;&lt;/p&gt;

</description>
        <pubDate>Wed, 21 Feb 2024 00:00:00 +0000</pubDate>
        <link>http://www.emutelab.org/blog/difflog-gsoc</link>
        <guid isPermaLink="true">http://www.emutelab.org/blog/difflog-gsoc</guid>
        
        <category>meeting</category>
        
        <category>gsoc</category>
        
        <category>AI</category>
        
        <category>differentiable logic</category>
        
        
        <category>blog</category>
        
      </item>
    
      <item>
        <title>Lab Meeting with Victor Shepardson</title>
        <description>&lt;p&gt;We are holding an Emute Lab Monthly meeting, roughly on the final Tuesday of each month.&lt;/p&gt;

&lt;p&gt;We are really excited to be able to host Victor Shepardson this month.&lt;/p&gt;

&lt;p&gt;Victor is a postgraduate researcher at the Intelligent Instruments Lab in Reykjavik (https://iil.is), where he is exploring a lived-in approach to designing with AI for new musical instruments.  Victor has kindly agreed to talk us through the art of training RAVE models (Realtime Audio Variational autoEncoder, https://github.com/acids-ircam/RAVE).  From there we hope to expand the discussion to how we might manipulate the resulting model in different ways.&lt;/p&gt;

&lt;p&gt;Date: Tuesday 27th February
Time: 1pm
Location: Sussex Humanities Lab
Zoom: This is an open list so please email me for the Zoom link if you wish to join.&lt;/p&gt;

</description>
        <pubDate>Wed, 21 Feb 2024 00:00:00 +0000</pubDate>
        <link>http://www.emutelab.org/blog/rave-lab-meeting</link>
        <guid isPermaLink="true">http://www.emutelab.org/blog/rave-lab-meeting</guid>
        
        <category>meeting</category>
        
        <category>AI</category>
        
        <category>rave</category>
        
        
        <category>blog</category>
        
      </item>
    
      <item>
        <title>Steve Symons wins Sussex AI Demo Award</title>
        <description>&lt;p&gt;Congratulations to Emute Lab PhD student Steve Symons, who won the £250 prize for best demo at the recent Sussex AI launch event.  Steve was showing his ‘entangled instrument’, which used a neural network to map a co-played instrument to different RAVE models.&lt;/p&gt;

&lt;p&gt;&lt;img src=&quot;/img/sussexAI1.jpeg&quot; /&gt;&lt;br /&gt;
&lt;img src=&quot;/img/sussexAI2.jpeg&quot; /&gt;&lt;br /&gt;
&lt;img src=&quot;/img/sussexAI3.jpeg&quot; /&gt;&lt;br /&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href=&quot;https://www.sussex.ac.uk/research/centres/ai-research-group/&quot;&gt;https://www.sussex.ac.uk/research/centres/ai-research-group/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href=&quot;http://entangled-instruments.xyz&quot;&gt;http://entangled-instruments.xyz&lt;/a&gt;&lt;/p&gt;
</description>
        <pubDate>Wed, 21 Feb 2024 00:00:00 +0000</pubDate>
        <link>http://www.emutelab.org/blog/demo-prize</link>
        <guid isPermaLink="true">http://www.emutelab.org/blog/demo-prize</guid>
        
        <category>sussex AI</category>
        
        <category>demo</category>
        
        
        <category>blog</category>
        
      </item>
    
      <item>
        <title>Emute Lab Research Fellow Post</title>
        <description>&lt;p&gt;There is a new Research Fellow post, working with Chris Kiefer on his new AHRC ‘Musically Embodied Machine Learning’ project (more info on this later…)&lt;/p&gt;

&lt;p&gt;The info for the post is here:&lt;/p&gt;

&lt;p&gt;&lt;a href=&quot;https://jobs.sussex.ac.uk/job/d9c1cb82-45cd-49f2-8642-0d8b49188e33&quot;&gt;https://jobs.sussex.ac.uk/job/d9c1cb82-45cd-49f2-8642-0d8b49188e33&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This position is for a post-doctoral researcher on the AHRC-funded project Musically Embodied Machine Learning (MEML), within the Department of Music.  The project is an investigation into the musically expressive potential of machine learning when embodied within physical musical instruments.&lt;/p&gt;

&lt;p&gt;It proposes &lt;code class=&quot;language-plaintext highlighter-rouge&quot;&gt;tuneable ML&lt;/code&gt;, an approach to exploring the musicality of ML models when they can be adjusted, personalised and remade, using a musical instrument as the interface. The project asks how instruments can be designed to make effective and musical use of embedded ML processes, and questions the implications for instrument designers and musicians when tuneable ML processes are a fundamental driver of an instrument’s musical feel and musical behaviour.&lt;/p&gt;

&lt;p&gt;The role will encompass co-designing new musical instruments with practicing musicians, and evaluating the instruments through their experiences.   The project will offer opportunities for developing skills in creative artificial intelligence, musical instrument design (hardware and software) and participatory research methods.  There will be opportunities for publication, industry collaboration, conference and trade show attendance, and development and release of open source software and hardware.&lt;/p&gt;

&lt;p&gt;The post is for 18 months at 0.5 FTE, and the work will be conducted on campus in Brighton. The candidate will become a member of the Experimental Music Technologies Lab, which represents a diverse group of researchers and practicing musicians at the University of Sussex.&lt;/p&gt;

&lt;p&gt;Please contact Dr Chris Kiefer (c.kiefer@sussex.ac.uk) for informal enquiries.&lt;/p&gt;

</description>
        <pubDate>Wed, 21 Feb 2024 00:00:00 +0000</pubDate>
        <link>http://www.emutelab.org/blog/MEML-PDRA</link>
        <guid isPermaLink="true">http://www.emutelab.org/blog/MEML-PDRA</guid>
        
        <category>MEML</category>
        
        <category>jobs</category>
        
        <category>research fellow</category>
        
        
        <category>blog</category>
        
      </item>
    
  </channel>
</rss>
