<schedule>
<version>187</version>
<conference>
<acronym>2024</acronym>
<title>Snap!shot 2024</title>
<start>2024-07-31</start>
<end>2024-08-01</end>
<days>2</days>
<timeslot_duration>00:01</timeslot_duration>
</conference>
<day date='2024-07-31' index='1'>
<room name='Online Room 1'>
<event guid='38enVVNhVzgTwq7TU_jPOg' id='741'>
<date>2024-07-31T16:30:00+02:00</date>
<start>14:30</start>
<duration>00:10</duration>
<room>Online Room 1</room>
<type>Plenary Session</type>
<language></language>
<slug>741-welcome-to-snap-shot</slug>
<title>Welcome to Snap!shot</title>
<subtitle></subtitle>
<track></track>
<abstract>Welcome to Snap!shot</abstract>
<description>Welcome to Snap!shot</description>
<recording>
<license />
<optout>false</optout>
</recording>
<persons>
<person id='312'>Jadga Hügle</person>
</persons>
</event>
<event guid='M4rVC2_ynZWIsdC8eGmzlQ' id='726'>
<date>2024-07-31T16:40:00+02:00</date>
<start>14:40</start>
<duration>00:07</duration>
<room>Online Room 1</room>
<type>Lightning Talk</type>
<language></language>
<slug>726-snapal-a-llm-here-to-help-you-with-snap-s-documentation</slug>
<title>SnaPal: a LLM here to help you with Snap!&#39;s documentation</title>
<subtitle></subtitle>
<track></track>
<abstract>In this lightning talk, we will speak on how we are utilizing Snap!&#39;s ~400 page documentation, along with RAG (Retrieval-Augmented Generation) to create a LLM that pulls from the documentation in order to help answer common questions. We first have the Large Language Model process Snap!’s documentation and create an index, breaking down the 400 pages into a smaller, more searchable format. The Large Language Model, when prompted, will then search and utilize the most relevant chunk of text/images. This will be useful because it will remove learning friction and allow beginners access to tools they might not be aware of, as well as improving accessibility and is accurate and contextually relevant (due to it referencing the already created documentation).
</abstract>
<description>In this lightning talk, we will speak on how we are utilizing Snap!&#39;s ~400 page documentation, along with RAG (Retrieval-Augmented Generation) to create a LLM that pulls from the documentation in order to help answer common questions. We first have the Large Language Model process Snap!’s documentation and create an index, breaking down the 400 pages into a smaller, more searchable format. The Large Language Model, when prompted, will then search and utilize the most relevant chunk of text/images. This will be useful because it will remove learning friction and allow beginners access to tools they might not be aware of, as well as improving accessibility and is accurate and contextually relevant (due to it referencing the already created documentation).
</description>
<recording>
<license />
<optout>false</optout>
</recording>
<persons>
<person id='1044'>Yuan Garcia</person>
</persons>
</event>
<event guid='jE-aPkNBCOx8cV2SSGWM9A' id='737'>
<date>2024-07-31T16:40:00+02:00</date>
<start>14:40</start>
<duration>00:10</duration>
<room>Online Room 1</room>
<type>Lightning Talks Discussion</type>
<language></language>
<slug>737-lightning-talks-1</slug>
<title>Lightning Talks 1</title>
<subtitle></subtitle>
<track></track>
<abstract>Five 7-minute talks - get enlightened :)</abstract>
<description>Five 7-minute talks - get enlightened :)</description>
<recording>
<license />
<optout>false</optout>
</recording>
<persons>
</persons>
</event>
<event guid='Qo10MULvIACOpwX3b9u18A' id='718'>
<date>2024-07-31T16:47:00+02:00</date>
<start>14:47</start>
<duration>00:07</duration>
<room>Online Room 1</room>
<type>Lightning Talk</type>
<language></language>
<slug>718-sneak-preview-of-the-new-turtlestitch</slug>
<title>Sneak Preview of the New Turtlestitch</title>
<subtitle>A New Cloud UI and Project Content Segment</subtitle>
<track></track>
<abstract>The [TurtleStitch](http://www.turtlestitch.org) journey began with its debut at the Scratch Conference in 2015, introducing its unique concept for creative coding with embroidery machines. Over the years, the Snap!-based TurtleStitch has evolved, comprising two primary components: the coding interface and the user/project cloud. While these components have received regular updates, we recognized the need for a major enhancement and an additional offering for our user base.

In this talk, we will present a sneak preview of the new TurtleStitch cloud interface and introduce an exciting new component: a content segment featuring detailed project descriptions, akin to Instructables for TurtleStitch projects. Additionally, we will delve into our motivations behind these updates and share our vision for the future of TurtleStitch.</abstract>
<description>The [TurtleStitch](http://www.turtlestitch.org) journey began with its debut at the Scratch Conference in 2015, introducing its unique concept for creative coding with embroidery machines. Over the years, the Snap!-based TurtleStitch has evolved, comprising two primary components: the coding interface and the user/project cloud. While these components have received regular updates, we recognized the need for a major enhancement and an additional offering for our user base.

In this talk, we will present a sneak preview of the new TurtleStitch cloud interface and introduce an exciting new component: a content segment featuring detailed project descriptions, akin to Instructables for TurtleStitch projects. Additionally, we will delve into our motivations behind these updates and share our vision for the future of TurtleStitch.</description>
<recording>
<license />
<optout>false</optout>
</recording>
<persons>
<person id='93'>Andrea Mayr-Stalder</person>
</persons>
</event>
<event guid='ITGpo-1Q4NqM0x9iN-LGaw' id='712'>
<date>2024-07-31T16:54:00+02:00</date>
<start>14:54</start>
<duration>00:07</duration>
<room>Online Room 1</room>
<type>Lightning Talk</type>
<language></language>
<slug>712-from-pattern-to-code</slug>
<title>From Pattern to Code </title>
<subtitle>Video Tutorial Resources for TurtleStitch.</subtitle>
<track></track>
<abstract>**From Pattern to Code** is a series of videos that show how to develop code to create patterns in TurtleStitch. Using TurtleStitch means that patterns can then be stitched onto cloth using an embroidery machine, enabling the creation of tangible objects.

These resources take a different approach to learning to code. Instead of focusing on key concepts in computing, or introducing commands and their functions and building bottom-up from there, these resources present a range of images showing patterns and explore top-down how they can be created from code. Clicking on an image opens a short 2-minute video that explains how the pattern can be created using code.

The idea is to give people a reason to want to learn to program and to inspire them to create their own patterns. Meanwhile it introduces them to useful concepts in computing and helps them become familiar with a range of commands. It gives valuable information about the contexts where it&#39;s appropriate to use specific commands and shows how to apply them.

Videos are used because of the flexibility they offer. A video can be watched to get an idea of how to create the pattern; it can be paused, slowed down or restarted, making it easy to follow key steps. The short duration of the videos minimises the time invested in deciding whether to create the pattern.

[www.warwick.ac.uk/turtlestitch/patterntocode](http://www.warwick.ac.uk/turtlestitch/patterntocode)</abstract>
<description>**From Pattern to Code** is a series of videos that show how to develop code to create patterns in TurtleStitch. Using TurtleStitch means that patterns can then be stitched onto cloth using an embroidery machine, enabling the creation of tangible objects.

These resources take a different approach to learning to code. Instead of focusing on key concepts in computing, or introducing commands and their functions and building bottom-up from there, these resources present a range of images showing patterns and explore top-down how they can be created from code. Clicking on an image opens a short 2-minute video that explains how the pattern can be created using code.

The idea is to give people a reason to want to learn to program and to inspire them to create their own patterns. Meanwhile it introduces them to useful concepts in computing and helps them become familiar with a range of commands. It gives valuable information about the contexts where it&#39;s appropriate to use specific commands and shows how to apply them.

Videos are used because of the flexibility they offer. A video can be watched to get an idea of how to create the pattern; it can be paused, slowed down or restarted, making it easy to follow key steps. The short duration of the videos minimises the time invested in deciding whether to create the pattern.

[www.warwick.ac.uk/turtlestitch/patterntocode](http://www.warwick.ac.uk/turtlestitch/patterntocode)</description>
<recording>
<license />
<optout>false</optout>
</recording>
<persons>
<person id='391'>Margaret Low</person>
</persons>
</event>
<event guid='hd8eI3ke_BI0v5sG32MtPg' id='732'>
<date>2024-07-31T17:01:00+02:00</date>
<start>15:01</start>
<duration>00:07</duration>
<room>Online Room 1</room>
<type>Lightning Talk</type>
<language></language>
<slug>732-using-turtlestitch-to-teach-mathematics-and-computer-science</slug>
<title>Using Turtlestitch to Teach Mathematics and Computer Science</title>
<subtitle>an experience report from South Africa</subtitle>
<track></track>
<abstract>South African Shweshwe fabric contains beautiful geometric patterns that are ideal as a basis for different Turtlestitch designs. In our project we used these designs to teach learners from different township schools in Cape Town programming and geometry. In our talk we want to share our experiences from our initial events and our plan for extending the project into the future with peer learning and a curriculum.</abstract>
<description>South African Shweshwe fabric contains beautiful geometric patterns that are ideal as a basis for different Turtlestitch designs. In our project we used these designs to teach learners from different township schools in Cape Town programming and geometry. In our talk we want to share our experiences from our initial events and our plan for extending the project into the future with peer learning and a curriculum.</description>
<recording>
<license />
<optout>false</optout>
</recording>
<persons>
<person id='312'>Jadga Hügle</person>
</persons>
</event>
<event guid='-sCHie0WXn6_t6PS8fxnsA' id='724'>
<date>2024-07-31T17:08:00+02:00</date>
<start>15:08</start>
<duration>00:07</duration>
<room>Online Room 1</room>
<type>Lightning Talk</type>
<language></language>
<slug>724-extending-extensions</slug>
<title>Extending Extensions</title>
<subtitle>Exploring Snap!&#39;s Library and Extension Mechanisms to Create Custom Extensions</subtitle>
<track></track>
<abstract>The extension and library mechanisms of Snap! offer significant opportunities to enhance and customize the user experience. In this talk, I will explore the power and flexibility of these mechanisms by recreating the BloP extension using only the &quot;primitive&quot; block, without modifying the software&#39;s source code. The BloP extension introduces advanced features for managing the visibility of block categories, blocks, sprites, and initialization scripts. This allows users to create “safe” IDEs for their custom block-based programming languages, ensuring they do not accidentally impair their language by removing vital sprites and scripts or changing the definition of language blocks. For this purpose, the “ide_” group of JavaScript functions within the “primitive” block will be primarily used.

The &quot;primitive&quot; block is designed to provide an effective method for creating extensions that can deeply interact with the Snap! environment without compromising the software&#39;s stability. I will demonstrate how the available primitives can replicate most of BloP’s functionalities and what enhancements are needed to allow new environments for block-based programming languages to be easily and safely recreated within Snap!.

The final extension does not improve upon the features already offered by previous versions of the BloP extension, but it provides significant insights into the design of the Snap! extension mechanism and, more importantly, it can be used inside the online version of the Snap! IDE.</abstract>
<description>The extension and library mechanisms of Snap! offer significant opportunities to enhance and customize the user experience. In this talk, I will explore the power and flexibility of these mechanisms by recreating the BloP extension using only the &quot;primitive&quot; block, without modifying the software&#39;s source code. The BloP extension introduces advanced features for managing the visibility of block categories, blocks, sprites, and initialization scripts. This allows users to create “safe” IDEs for their custom block-based programming languages, ensuring they do not accidentally impair their language by removing vital sprites and scripts or changing the definition of language blocks. For this purpose, the “ide_” group of JavaScript functions within the “primitive” block will be primarily used.

The &quot;primitive&quot; block is designed to provide an effective method for creating extensions that can deeply interact with the Snap! environment without compromising the software&#39;s stability. I will demonstrate how the available primitives can replicate most of BloP’s functionalities and what enhancements are needed to allow new environments for block-based programming languages to be easily and safely recreated within Snap!.

The final extension does not improve upon the features already offered by previous versions of the BloP extension, but it provides significant insights into the design of the Snap! extension mechanism and, more importantly, it can be used inside the online version of the Snap! IDE.</description>
<recording>
<license />
<optout>false</optout>
</recording>
<persons>
<person id='234'>Stefano Federici</person>
</persons>
</event>
<event guid='2jiEXM-tmzgKOPY0cbdSIg' id='735'>
<date>2024-07-31T17:25:00+02:00</date>
<start>15:25</start>
<duration>00:25</duration>
<room>Online Room 1</room>
<type>Break</type>
<language></language>
<slug>735-break-1</slug>
<title>Break 1</title>
<subtitle></subtitle>
<track></track>
<abstract>Enjoy a coffee or grab some food, we&#39;ll see you soon :)</abstract>
<description>Enjoy a coffee or grab some food, we&#39;ll see you soon :)</description>
<recording>
<license />
<optout>false</optout>
</recording>
<persons>
</persons>
</event>
<event guid='9IDHbmAy2IOYJ-MI5b7snw' id='727'>
<date>2024-07-31T17:50:00+02:00</date>
<start>15:50</start>
<duration>00:07</duration>
<room>Online Room 1</room>
<type>Lightning Talk</type>
<language></language>
<slug>727-integrating-snap-into-prairielearn</slug>
<title>Integrating Snap! into PrairieLearn</title>
<subtitle>Enhancing Testing and Assignment Delivery</subtitle>
<track></track>
<abstract>
In this presentation, we explore the integration of Snap! into PrairieLearn, a robust testing and assignment delivery platform. Building upon last year&#39;s advancements in Snap! autograders integrated with Gradescope, this session introduces a novel approach where students can seamlessly program within the PrairieLearn environment and submit their assignments without leaving the interface. This integration not only streamlines the student experience but also enhances instructor capabilities in assessing programming assignments.

Attendees will learn about the technical architecture behind this integration, including the challenges encountered and solutions devised to ensure a smooth user experience. We will discuss how integrating Snap! with PrairieLearn expands the platform&#39;s utility beyond traditional coding environments. Practical examples and demonstrations will showcase the functionalities and efficiencies gained through this integration, illustrating its potential impact on programming education. We will also discuss the results of the student experience research studies we conducted assessing the ease and effectiveness of using Snap! in PrairieLearn.</abstract>
<description>
In this presentation, we explore the integration of Snap! into PrairieLearn, a robust testing and assignment delivery platform. Building upon last year&#39;s advancements in Snap! autograders integrated with Gradescope, this session introduces a novel approach where students can seamlessly program within the PrairieLearn environment and submit their assignments without leaving the interface. This integration not only streamlines the student experience but also enhances instructor capabilities in assessing programming assignments.

Attendees will learn about the technical architecture behind this integration, including the challenges encountered and solutions devised to ensure a smooth user experience. We will discuss how integrating Snap! with PrairieLearn expands the platform&#39;s utility beyond traditional coding environments. Practical examples and demonstrations will showcase the functionalities and efficiencies gained through this integration, illustrating its potential impact on programming education. We will also discuss the results of the student experience research studies we conducted assessing the ease and effectiveness of using Snap! in PrairieLearn.</description>
<recording>
<license />
<optout>false</optout>
</recording>
<persons>
<person id='2143'>Victoria Phelps</person>
</persons>
</event>
<event guid='YxRDgENbpLW_WspdrefnFQ' id='715'>
<date>2024-07-31T17:57:00+02:00</date>
<start>15:57</start>
<duration>00:07</duration>
<room>Online Room 1</room>
<type>Lightning Talk</type>
<language></language>
<slug>715-smart-toys</slug>
<title>Smart Toys</title>
<subtitle>Snap! Programmable Playground Toys</subtitle>
<track></track>
<abstract>**Concept Overview**:
The [Smart Toys](https://raw.githubusercontent.com/pixavier/meta4snap/main/st/video.mp4) playground is designed around programmable devices that form the core of an engaging and educational game. 	


![Smart Toys](https://raw.githubusercontent.com/pixavier/microworlds/main/mws/snapshot2024/smarttoys.png &quot;Smart Toys&quot;)


**The Game**: Chase the Colored Light
At the heart of this playground lies a simple yet captivating game called &#39;Chase the Colored Light.&#39; Here’s how it works:

- A colored light appears at random locations within the playground.

- Children must run and touch the sensor that matches the color of the light.

- This game enhances quick thinking, color recognition, and physical agility as children chase and match the colors.

**Technological Integration**:
The magic behind the [Smart Toys](https://provisional.binefa.com/index.php/Smart_Toys) playground comes from the IoT Vertebrae open-source hardware project.

- Programmable Logic with Snap!:
   - The game logic for &#39;Chase the Colored Light&#39; can be programmed using Snap!
   - This allows educators and kids to modify and create new game rules, fostering a creative and evolving play environment.

- Hardware - The [IoT Vertebrae](https://provisional.binefa.com/index.php/IoT-Vertebrae):
   - Each toy is equipped with an IoT Vertebrae ([presented](https://www.youtube.com/watch?v=hoDaJh98o3k) at [Snap!Con2023](https://www.snapcon.org/conferences/2023/program/proposals/627)), a robust hardware platform tailored for Internet of Things (IoT) applications.
   - The ESP32- and Raspberry Pi-based IoT Vertebrae ensures reliable connectivity and interaction among the toys in the playground via the MQTT protocol.

- [Digital Twin](https://xavierpi.com/st) based [microworld](https://xavierpi.com/tp) in Snap!:
   - To enhance the development and testing of new game rules, a digital twin based microworld of the playground has been created in Snap!.
   - This microworld allows experimentation with powerful game logic ideas without disrupting the physical playground, enabling continuous innovation and refinement.
</abstract>
<description>**Concept Overview**:
The [Smart Toys](https://raw.githubusercontent.com/pixavier/meta4snap/main/st/video.mp4) playground is designed around programmable devices that form the core of an engaging and educational game. 	


![Smart Toys](https://raw.githubusercontent.com/pixavier/microworlds/main/mws/snapshot2024/smarttoys.png &quot;Smart Toys&quot;)


**The Game**: Chase the Colored Light
At the heart of this playground lies a simple yet captivating game called &#39;Chase the Colored Light.&#39; Here’s how it works:

- A colored light appears at random locations within the playground.

- Children must run and touch the sensor that matches the color of the light.

- This game enhances quick thinking, color recognition, and physical agility as children chase and match the colors.

**Technological Integration**:
The magic behind the [Smart Toys](https://provisional.binefa.com/index.php/Smart_Toys) playground comes from the IoT Vertebrae open-source hardware project.

- Programmable Logic with Snap!:
   - The game logic for &#39;Chase the Colored Light&#39; can be programmed using Snap!
   - This allows educators and kids to modify and create new game rules, fostering a creative and evolving play environment.

- Hardware - The [IoT Vertebrae](https://provisional.binefa.com/index.php/IoT-Vertebrae):
   - Each toy is equipped with an IoT Vertebrae ([presented](https://www.youtube.com/watch?v=hoDaJh98o3k) at [Snap!Con2023](https://www.snapcon.org/conferences/2023/program/proposals/627)), a robust hardware platform tailored for Internet of Things (IoT) applications.
   - The ESP32- and Raspberry Pi-based IoT Vertebrae ensures reliable connectivity and interaction among the toys in the playground via the MQTT protocol.

- [Digital Twin](https://xavierpi.com/st) based [microworld](https://xavierpi.com/tp) in Snap!:
   - To enhance the development and testing of new game rules, a digital twin based microworld of the playground has been created in Snap!.
   - This microworld allows experimentation with powerful game logic ideas without disrupting the physical playground, enabling continuous innovation and refinement.
</description>
<recording>
<license />
<optout>false</optout>
</recording>
<persons>
<person id='218'>Xavier Pi</person>
<person id='613'>Jordi Binefa</person>
</persons>
</event>
<event guid='hNuDwjfDJNmhgSaB9TwT3w' id='729'>
<date>2024-07-31T18:04:00+02:00</date>
<start>16:04</start>
<duration>00:07</duration>
<room>Online Room 1</room>
<type>Lightning Talk</type>
<language></language>
<slug>729-quantum-computing-and-snap</slug>
<title>Quantum Computing and Snap!</title>
<subtitle></subtitle>
<track></track>
<abstract>The second quantum revolution is upon us, with billions of dollars being invested in quantum computing technologies by initiatives around the world. How will students be introduced to these technologies and their various potential applications across sectors? When introducing Snap! to physicists for feedback on how quantum computing might be incorporated, the graphical language of ZX-Calculus was proposed as a natural fit for the interface. Join us for a brief introductory primer and preliminary thoughts on how these diagrams might be included in Snap!.</abstract>
<description>The second quantum revolution is upon us, with billions of dollars being invested in quantum computing technologies by initiatives around the world. How will students be introduced to these technologies and their various potential applications across sectors? When introducing Snap! to physicists for feedback on how quantum computing might be incorporated, the graphical language of ZX-Calculus was proposed as a natural fit for the interface. Join us for a brief introductory primer and preliminary thoughts on how these diagrams might be included in Snap!.</description>
<recording>
<license />
<optout>false</optout>
</recording>
<persons>
<person id='287'>Mary Fries</person>
</persons>
</event>
<event guid='k-uEk-9TPrKVo7unOJ9PrQ' id='730'>
<date>2024-07-31T18:11:00+02:00</date>
<start>16:11</start>
<duration>00:07</duration>
<room>Online Room 1</room>
<type>Lightning Talk</type>
<language></language>
<slug>730-the-art-of-paper-engineering-with-turtlestitch</slug>
<title>The Art Of Paper Engineering With TurtleStitch </title>
<subtitle></subtitle>
<track></track>
<abstract>While TurtleStitch is known primarily for generating embroidery patterns, it can also be adapted for paper engineering projects. By harnessing the power of programming in TurtleStitch and cutting the design with an electronic paper cutting machine such as a Silhouette Cameo, a bridge from the digital to the physical world can be created, unlocking new possibilities for customizable paper designs. It is an educational tool which can teach geometry, engineering principles and spatial awareness.

In this talk, I will explain how I created a paper microscope using the art of paper engineering with TurtleStitch: https://turtlestitch.blogspot.com/2024/06/a-paper-microscope-coded-in-turtlestitch.html
</abstract>
<description>While TurtleStitch is known primarily for generating embroidery patterns, it can also be adapted for paper engineering projects. By harnessing the power of programming in TurtleStitch and cutting the design with an electronic paper cutting machine such as a Silhouette Cameo, a bridge from the digital to the physical world can be created, unlocking new possibilities for customizable paper designs. It is an educational tool which can teach geometry, engineering principles and spatial awareness.

In this talk, I will explain how I created a paper microscope using the art of paper engineering with TurtleStitch: https://turtlestitch.blogspot.com/2024/06/a-paper-microscope-coded-in-turtlestitch.html
</description>
<recording>
<license />
<optout>false</optout>
</recording>
<persons>
<person id='3285'>Elaine Wolfe</person>
</persons>
</event>
<event guid='9eJVQ3HTl9KOXzADyUVFlA' id='709'>
<date>2024-07-31T18:18:00+02:00</date>
<start>16:18</start>
<duration>00:07</duration>
<room>Online Room 1</room>
<type>Lightning Talk</type>
<language></language>
<slug>709-building-3d-worlds-in-a-snap</slug>
<title>Building 3D Worlds in a Snap!</title>
<subtitle>Scenario creation in RoboScape Online</subtitle>
<track></track>
<abstract>At a previous Snap_!_con, we demonstrated a robotics simulation that runs in the NetsBlox/Snap_!_ interface. However, the environments the robots interact with were created in C# code; only the robot controller code was in Snap_!_. This made it infeasible for a student to contribute a new activity to the program.

In contrast, our newest version has every environment built as a NetsBlox project, even when running on our server. These projects have access to (almost) all the usual NetsBlox/Snap! features, allowing for more diverse and more complex projects while also making the system easier to work with.</abstract>
<description>At a previous Snap_!_con, we demonstrated a robotics simulation that runs in the NetsBlox/Snap_!_ interface. However, the environments the robots interact with were created in C# code; only the robot controller code was in Snap_!_. This made it infeasible for a student to contribute a new activity to the program.

In contrast, our newest version has every environment built as a NetsBlox project, even when running on our server. These projects have access to (almost) all the usual NetsBlox/Snap! features, allowing for more diverse and more complex projects while also making the system easier to work with.</description>
<recording>
<license />
<optout>false</optout>
</recording>
<persons>
<person id='2020'>Gordon Stein</person>
</persons>
</event>
<event guid='OMNYhxmZQzlwe28t_4R9aA' id='738'>
<date>2024-07-31T18:25:00+02:00</date>
<start>16:25</start>
<duration>00:10</duration>
<room>Online Room 1</room>
<type>Lightning Talks Discussion</type>
<language></language>
<slug>738-lightning-talks-2</slug>
<title>Lightning Talks 2</title>
<subtitle></subtitle>
<track></track>
<abstract>Five 7-minute talks - get enlightened :)</abstract>
<description>Five 7-minute talks - get enlightened :)</description>
<recording>
<license />
<optout>false</optout>
</recording>
<persons>
</persons>
</event>
<event guid='AHmDfEiQY8XGy6UpoaP7CQ' id='711'>
<date>2024-07-31T18:40:00+02:00</date>
<start>16:40</start>
<duration>00:03</duration>
<room>Online Room 1</room>
<type>Show Your Project</type>
<language></language>
<slug>711-mood-classifier-with-arblox</slug>
<title>Mood classifier with ARBlox</title>
<subtitle>A Simple Nearest Neighbors Project </subtitle>
<track></track>
<abstract>The project uses ARBlox to classify the mood of the user. The user is able to collect data samples of custom expressions. Once data is collected, the user can infer their facial expression in real time. The speed of the algorithm enables its use as part of various expression-driven games.</abstract>
<description>The project uses ARBlox to classify the mood of the user. The user is able to collect data samples of custom expressions. Once data is collected, the user can infer their facial expression in real time. The speed of the algorithm enables its use as part of various expression-driven games.</description>
<recording>
<license />
<optout>false</optout>
</recording>
<persons>
<person id='4916'>Saman Kittani</person>
</persons>
</event>
<event guid='tXOvCQS1jsGdSOKCcvVcyQ' id='716'>
<date>2024-07-31T18:43:00+02:00</date>
<start>16:43</start>
<duration>00:03</duration>
<room>Online Room 1</room>
<type>Show Your Project</type>
<language></language>
<slug>716-tackling-the-turing-test</slug>
<title>Tackling the Turing Test</title>
<subtitle></subtitle>
<track></track>
<abstract>In the last few years, we have seen the advent of practical natural language processing in the form of various generative tools such as ChatGPT, Gemini, and many others. In fact, these tools are so powerful at understanding our requests and generating appropriate responses that many people have begun to ascribe sentience to these chatbots and claim they mark the near future of artificial general intelligence. Well, to get to the bottom of this, let’s put them to the test! Using NetsBlox, a fork of Snap! that adds various networking features such as message passing over the internet, I have created a chatroom project where users can connect and be randomly matched either with another human or with an instance of ChatGPT. Users can then engage in a back-and-forth turn-based dialog and see if they can tell the difference between man and machine.</abstract>
<description>In the last few years, we have seen the advent of practical natural language processing in the form of various generative tools such as ChatGPT, Gemini, and many others. In fact, these tools are so powerful at understanding our requests and generating appropriate responses that many people have begun to ascribe sentience to these chatbots and claim they mark the near future of artificial general intelligence. Well, to get to the bottom of this, let’s put them to the test! Using NetsBlox, a fork of Snap! that adds various networking features such as message passing over the internet, I have created a chatroom project where users can connect and be randomly matched either with another human or with an instance of ChatGPT. Users can then engage in a back-and-forth turn-based dialog and see if they can tell the difference between man and machine.</description>
<recording>
<license />
<optout>false</optout>
</recording>
<persons>
<person id='1014'>Devin Jean</person>
</persons>
</event>
<event guid='Ghn_Zbo0Rkk3gSVNO-4svw' id='719'>
<date>2024-07-31T18:46:00+02:00</date>
<start>16:46</start>
<duration>00:03</duration>
<room>Online Room 1</room>
<type>Show Your Project</type>
<language></language>
<slug>719-play-2-player-tic-tac-toe-perfectly-with-snap-gamesman-and-a-webcam</slug>
<title>Play 2-player Tic-Tac-Toe Perfectly with Snap!, GAMESMAN, and a Webcam</title>
<subtitle></subtitle>
<track></track>
<abstract>The GAMESMAN system is a piece of software 35 years in the making -- it solves board games (2-person abstract strategy games of no chance), builds a database of the value (win, tie, or lose) for every position, and provides users with a GUI to play and analyze them. We recently opened up the API to allow any external client to access our database. At Snap!Con 2023 I demoed a Snap! program that played one-player Tic-Tac-Toe perfectly (the computer makes the moves for the second player) using our back-end as the &quot;brains&quot;. The user specified their moves by clicking on a slot on the 3x3 board with their mouse.

For this year&#39;s Snap!Shot 2024 I added the ability to play 2-player Tic-Tac-Toe on a physical board, with a webcam recording the moves and Snap! displaying the value of the moves [from a stoplight: red=losing, yellow=tying, green=winning] for the player whose turn it is. You need blue and red pieces (poker chips work great), and blue (X) goes first.

- Demo: https://youtu.be/iV5MFALQObc 
- Snap! project: https://snap.berkeley.edu/snap/snap.html#present:Username=dan%20garcia&amp;ProjectName=Gamesman%20AR%20for%20TicTacToe
- Tic-Tac-Toe board: https://www.dropbox.com/scl/fi/mwvdpbbvvqgjlz81ng3cn/Tic-Tac-Toe-AR.pdf?rlkey=520myvv8e9s334sejukk6ddv5&amp;dl=0</abstract>
<description>The GAMESMAN system is a piece of software 35 years in the making -- it solves board games (2-person abstract strategy games of no chance), builds a database of the value (win, tie, or lose) for every position, and provides users with a GUI to play and analyze them. We recently opened up the API to allow any external client to access our database. At Snap!Con 2023 I demoed a Snap! program that played one-player Tic-Tac-Toe perfectly (the computer makes the moves for the second player) using our back-end as the &quot;brains&quot;. The user specified their moves by clicking on a slot on the 3x3 board with their mouse.

For this year&#39;s Snap!Shot 2024 I added the ability to play 2-player Tic-Tac-Toe on a physical board, with a webcam recording the moves and Snap! displaying the value of the moves [from a stoplight: red=losing, yellow=tying, green=winning] for the player whose turn it is. You need blue and red pieces (poker chips work great), and blue (X) goes first.

- Demo: https://youtu.be/iV5MFALQObc 
- Snap! project: https://snap.berkeley.edu/snap/snap.html#present:Username=dan%20garcia&amp;ProjectName=Gamesman%20AR%20for%20TicTacToe
- Tic-Tac-Toe board: https://www.dropbox.com/scl/fi/mwvdpbbvvqgjlz81ng3cn/Tic-Tac-Toe-AR.pdf?rlkey=520myvv8e9s334sejukk6ddv5&amp;dl=0</description>
<recording>
<license />
<optout>false</optout>
</recording>
<persons>
<person id='317'>Dan Garcia</person>
</persons>
</event>
<event guid='iZxX62CcupphcDD0RHaTdg' id='728'>
<date>2024-07-31T18:49:00+02:00</date>
<start>16:49</start>
<duration>00:03</duration>
<room>Online Room 1</room>
<type>Show Your Project</type>
<language></language>
<slug>728-pokewan</slug>
<title>Pokewan!</title>
<subtitle></subtitle>
<track></track>
<abstract>For my Computer Science Principles class, we were required to make a project in Snap. It could be anything we wanted, and I decided to make a top-down, Pokemon-inspired game in an 8-bit style. The player can explore a vast map, talking to characters and collecting 5 trinkets along the way. I put a lot of time and effort into perfecting the code and drawing the sprites myself, and I hope you enjoy!
https://snap.berkeley.edu/snap/snap.html#present:Username=taliayy&amp;ProjectName=Pokewan</abstract>
<description>For my Computer Science Principles class, we were required to make a project in Snap. It could be anything we wanted, and I decided to make a top-down, Pokemon-inspired game in an 8-bit style. The player can explore a vast map, talking to characters and collecting 5 trinkets along the way. I put a lot of time and effort into perfecting the code and drawing the sprites myself, and I hope you enjoy!
https://snap.berkeley.edu/snap/snap.html#present:Username=taliayy&amp;ProjectName=Pokewan</description>
<recording>
<license />
<optout>false</optout>
</recording>
<persons>
<person id='597'>Talia Ye</person>
</persons>
</event>
<event guid='GNRzN-N0I02qFXNjiQ89Pw' id='705'>
<date>2024-07-31T18:52:00+02:00</date>
<start>16:52</start>
<duration>00:03</duration>
<room>Online Room 1</room>
<type>Show Your Project</type>
<language></language>
<slug>705-snap-hack</slug>
<title>Snap!Hack</title>
<subtitle>Implementing the classic roguelike Hack in Snap!</subtitle>
<track></track>
<abstract>### Reimagining Hack using Snap!
Jay Fenlason, one of Brian Harvey&#39;s CS students at Lincoln-Sudbury Regional High School, was inspired to create an implementation of the computer game _Rogue_. Jay would call his game _Hack_ and it would become the basis for _NetHack_, a roguelike game that is still being updated and played today. There&#39;s even a later version of _Hack_ available as part of the BSD Games package for Linux and FreeBSD machines created by Andries Brouwer after Jay ceased working on the original _Hack_.

While the original _Hack_ sadly appears to be lost to bit rot, I thought it might be an interesting homage to Brian&#39;s work on _Snap!_ and Jay&#39;s seminal work on _Hack_ to implement an analogue to the original in _Snap!_ that I am calling _Snap!Hack_.

This demo will briefly show gameplay and include a discussion of the design decisions made in order to recreate _Hack_ on a modern computer in _Snap!_ Of course, you are welcome to play and remix the game yourself.

[View project on Snap!](https://snap.berkeley.edu/snap/snap.html#present:Username=dainialpadraig&amp;ProjectName=Snap!Hack)

[View project on GitHub](https://github.com/Sustainable-Games/snaphack)</abstract>
<description>### Reimagining Hack using Snap!
Jay Fenlason, one of Brian Harvey&#39;s CS students at Lincoln-Sudbury Regional High School, was inspired to create an implementation of the computer game _Rogue_. Jay would call his game _Hack_ and it would become the basis for _NetHack_, a roguelike game that is still being updated and played today. There&#39;s even a later version of _Hack_ available as part of the BSD Games package for Linux and FreeBSD machines created by Andries Brouwer after Jay ceased working on the original _Hack_.

While the original _Hack_ sadly appears to be lost to bit rot, I thought it might be an interesting homage to Brian&#39;s work on _Snap!_ and Jay&#39;s seminal work on _Hack_ to implement an analogue to the original in _Snap!_ that I am calling _Snap!Hack_.

This demo will briefly show gameplay and include a discussion of the design decisions made in order to recreate _Hack_ on a modern computer in _Snap!_ Of course, you are welcome to play and remix the game yourself.

[View project on Snap!](https://snap.berkeley.edu/snap/snap.html#present:Username=dainialpadraig&amp;ProjectName=Snap!Hack)

[View project on GitHub](https://github.com/Sustainable-Games/snaphack)</description>
<recording>
<license />
<optout>false</optout>
</recording>
<persons>
<person id='4837'>Dan Stormont</person>
</persons>
</event>
<event guid='uzSjSw64CdXjAOpg4iS_7w' id='740'>
<date>2024-07-31T18:55:00+02:00</date>
<start>16:55</start>
<duration>00:30</duration>
<room>Online Room 1</room>
<type>Show Your Project Free Time</type>
<language></language>
<slug>740-show-your-project-session</slug>
<title>Show Your Project Session</title>
<subtitle></subtitle>
<track></track>
<abstract>3 minutes, no slides 💥💥💥</abstract>
<description>3 minutes, no slides 💥💥💥</description>
<recording>
<license />
<optout>false</optout>
</recording>
<persons>
</persons>
</event>
<event guid='CgMZI6rQ_DNv5t5YmkjyQQ' id='736'>
<date>2024-07-31T19:25:00+02:00</date>
<start>17:25</start>
<duration>00:25</duration>
<room>Online Room 1</room>
<type>Break</type>
<language></language>
<slug>736-break-2</slug>
<title>Break 2</title>
<subtitle></subtitle>
<track></track>
<abstract>Enjoy a coffee or grab some food, we&#39;ll see you soon :)</abstract>
<description>Enjoy a coffee or grab some food, we&#39;ll see you soon :)</description>
<recording>
<license />
<optout>false</optout>
</recording>
<persons>
</persons>
</event>
<event guid='oT7lID4od1zGNRUmfBt3ww' id='725'>
<date>2024-07-31T19:50:00+02:00</date>
<start>17:50</start>
<duration>00:07</duration>
<room>Online Room 1</room>
<type>Lightning Talk</type>
<language></language>
<slug>725-embedding-big-ideas-in-projects</slug>
<title>Embedding big ideas in projects</title>
<subtitle></subtitle>
<track></track>
<abstract>Pedagogy and engagement need to go hand in hand.  The main questions I ask when arranging a project are:

* Is it fun?  If not, can&#39;t I do better for my students?
* Does it embed big ideas naturally so the learning doesn&#39;t feel forced?
* Does it leverage the best features of the programming language?
* Is it accessible to everyone in the room?

I offer three projects that I believe address all four of these questions:

* Analog clock (anchors, number sense, angles, synchronized movements)
* Breakout (clones and game play)
* Fractal tree (pretty graphics and math regarding what it means for a problem to be solvable)

All of these have been hits with my students and I think they are worth thinking about carefully.</abstract>
<description>Pedagogy and engagement need to go hand in hand.  The main questions I ask when arranging a project are:

* Is it fun?  If not, can&#39;t I do better for my students?
* Does it embed big ideas naturally so the learning doesn&#39;t feel forced?
* Does it leverage the best features of the programming language?
* Is it accessible to everyone in the room?

I offer three projects that I believe address all four of these questions:

* Analog clock (anchors, number sense, angles, synchronized movements)
* Breakout (clones and game play)
* Fractal tree (pretty graphics and math regarding what it means for a problem to be solvable)

All of these have been hits with my students and I think they are worth thinking about carefully.</description>
<recording>
<license />
<optout>false</optout>
</recording>
<persons>
<person id='44'>Josh Paley</person>
</persons>
</event>
<event guid='JCAVomN2s8lrt-lrKUSeQg' id='704'>
<date>2024-07-31T19:57:00+02:00</date>
<start>17:57</start>
<duration>00:07</duration>
<room>Online Room 1</room>
<type>Lightning Talk</type>
<language></language>
<slug>704-what-turtles-can-and-can-t-teach</slug>
<title>What Turtles Can (and Can&#39;t) Teach </title>
<subtitle>A Brief Examination of Drawing in 2D</subtitle>
<track></track>
<abstract>Turtlestitch is great for small-scale embroidery. But can it be used for other fiber arts domains like quilting? In pursuit of an answer I&#39;ve explored turtle geometry, Cartesian solutions and, most recently, event-driven data capture. This lightning talk will demonstrate where turtles prevail as both a pedagogical and a functionally useful tool. But it will also suggest that bridges to tasks often called &#39;advanced&#39; in both fiber art and computer programming may be problematic with turtles. This quick presentation will provide a half dozen small projects: (1) pure turtle, (2) pure Cartesian, (3) pure data capture, (4) turtles with Cartesian, (5) turtles and data capture, (6) Cartesian and data capture. Of course there will be commentary on how keeping an open mind about purpose and pedagogy allows the student to define their own goals while learning powerful ideas in fiber arts, computer science and math. Full Turtlestitch projects used in this presentation will be labeled as &quot;SnapShotNN&quot; at https://turtlestitch.org/users/ursulawolz or email ursula.wolz@gmail.com</abstract>
<description>Turtlestitch is great for small-scale embroidery. But can it be used for other fiber arts domains like quilting? In pursuit of an answer I&#39;ve explored turtle geometry, Cartesian solutions and, most recently, event-driven data capture. This lightning talk will demonstrate where turtles prevail as both a pedagogical and a functionally useful tool. But it will also suggest that bridges to tasks often called &#39;advanced&#39; in both fiber art and computer programming may be problematic with turtles. This quick presentation will provide a half dozen small projects: (1) pure turtle, (2) pure Cartesian, (3) pure data capture, (4) turtles with Cartesian, (5) turtles and data capture, (6) Cartesian and data capture. Of course there will be commentary on how keeping an open mind about purpose and pedagogy allows the student to define their own goals while learning powerful ideas in fiber arts, computer science and math. Full Turtlestitch projects used in this presentation will be labeled as &quot;SnapShotNN&quot; at https://turtlestitch.org/users/ursulawolz or email ursula.wolz@gmail.com</description>
<recording>
<license />
<optout>false</optout>
</recording>
<persons>
<person id='2191'>Ursula Wolz</person>
</persons>
</event>
<event guid='Ril7W0PUdO5-bLBVMi8zKw' id='708'>
<date>2024-07-31T20:04:00+02:00</date>
<start>18:04</start>
<duration>00:07</duration>
<room>Online Room 1</room>
<type>Lightning Talk</type>
<language></language>
<slug>708-create-with-beatblox</slug>
<title>Create with BeatBlox</title>
<subtitle></subtitle>
<track></track>
<abstract>BeatBlox is an extension of NetsBlox. It consists of a family of blocks for music creation, adding blocks for playing instruments, playing sounds, and recording and exporting audio.

In this talk I am going to give usage examples of BeatBlox and also discuss how we aim to use BeatBlox to encourage students to learn Computer Science in an engaging way, specifically showcasing its networking, distributed computing, improved sonics, and instrument library. Built with the Web Audio API as its foundation, BeatBlox is optimized for modern browsers.</abstract>
<description>BeatBlox is an extension of NetsBlox. It consists of a family of blocks for music creation, adding blocks for playing instruments, playing sounds, and recording and exporting audio.

In this talk I am going to give usage examples of BeatBlox and also discuss how we aim to use BeatBlox to encourage students to learn Computer Science in an engaging way, specifically showcasing its networking, distributed computing, improved sonics, and instrument library. Built with the Web Audio API as its foundation, BeatBlox is optimized for modern browsers.</description>
<recording>
<license />
<optout>false</optout>
</recording>
<persons>
<person id='4894'>ebiwonop</person>
</persons>
</event>
<event guid='f3PNWkXhPHob-O3BqCnpig' id='710'>
<date>2024-07-31T20:11:00+02:00</date>
<start>18:11</start>
<duration>00:07</duration>
<room>Online Room 1</room>
<type>Lightning Talk</type>
<language></language>
<slug>710-arblox-an-augmented-reality-extension</slug>
<title>ARBlox, an Augmented Reality Extension </title>
<subtitle></subtitle>
<track></track>
<abstract>Augmented Reality (AR) allows for novel methods of human-computer interaction impossible through traditional devices. It also fosters engagement through its inherently visual nature. ARBlox, a NetsBlox extension, expands upon existing AR programming environments (e.g., Augmented-Reality Scratch) by offering expanded capabilities whilst remaining accessible. ARBlox enables hand, face, and body tracking. It also incorporates fiducial marker tracking (e.g., AprilTags) and 3D object rendering. There is a key distinction between ARBlox and existing tools: ARBlox provides users with unfettered access to data output. As a result, users are able to create complex AR projects, such as mood classifiers and virtual instruments. Using custom blocks, we are able to abstract away these complexities for beginners. With the tool&#39;s appeal to expert users and accessibility to novice learners, ARBlox has the potential to be a key tool in computer science education.</abstract>
<description>Augmented Reality (AR) allows for novel methods of human-computer interaction impossible through traditional devices. It also fosters engagement through its inherently visual nature. ARBlox, a NetsBlox extension, expands upon existing AR programming environments (e.g., Augmented-Reality Scratch) by offering expanded capabilities whilst remaining accessible. ARBlox enables hand, face, and body tracking. It also incorporates fiducial marker tracking (e.g., AprilTags) and 3D object rendering. There is a key distinction between ARBlox and existing tools: ARBlox provides users with unfettered access to data output. As a result, users are able to create complex AR projects, such as mood classifiers and virtual instruments. Using custom blocks, we are able to abstract away these complexities for beginners. With the tool&#39;s appeal to expert users and accessibility to novice learners, ARBlox has the potential to be a key tool in computer science education.</description>
<recording>
<license />
<optout>false</optout>
</recording>
<persons>
<person id='4916'>Saman Kittani</person>
</persons>
</event>
<event guid='o4d9AqgTzE6n3Gu4w3dTGg' id='723'>
<date>2024-07-31T20:18:00+02:00</date>
<start>18:18</start>
<duration>00:07</duration>
<room>Online Room 1</room>
<type>Lightning Talk</type>
<language></language>
<slug>723-what-coding-in-snap-might-look-like-10-years-from-now</slug>
<title>What coding in Snap! might look like 10 years from now</title>
<subtitle>Or, why the Apple Vision Pro and CoPilot might just be the first step...</subtitle>
<track></track>
<abstract>The Apple Vision Pro (AVP) has been called a “game-changing device”, allowing for a seamless blending of an augmented reality world, locked in 3D space, with your surroundings. Having explored coding in Snap! on the AVP, with the blocks floating above my table and coding as simple as moving and pinching in space, we got to thinking about what the experience might be like 10 years from now:

- Right now, Snap! on the AVP exists only within a Safari browser, but all the most immersive and intuitive AVP experiences are stand-alone apps designed specifically for the affordances the AVP provides. The Scratch Foundation did this with “ScratchJr” on the iPad and iPhone.
- There is an incredible amount of &quot;3D real estate&quot; with the AVP – enough that Snap! could have _all blocks visible at once_ (i.e., not hidden behind tabs), but floating in 3D space like a rolodex, perhaps, or side-by-side to the left and wrapping around a person.
- Selecting a paint editor could select Apple&#39;s built-in Freeform drawing app (or any other app a user wanted, like Photoshop). Similar for the sound editor.
- Everything done with gestures (what the mouse currently does) could also be accomplished by voice and looking at things – the voice equivalent of “keyboard input”.
- AI will probably support us in most of our everyday tasks ten years from now, so it&#39;s natural to think that there will be an AI assistant helping us code.  It might point out potential bugs before the green flag was clicked, suggest the next block based on what we&#39;re doing, or even author small utility blocks for us (if we asked for it and provided a clear spec).
- The &quot;stage&quot; might extend to be a real &quot;3D stage&quot;, with sprites available in 2D (as transparent PNG textures on always-facing-the-user polygons ala Doom) or 3D. This might allow us to connect with the Alice user community and their resources. At the very least (even if we stuck with a 2D stage), we would be able to “turn” the stage to see what sprites were in front of and behind each other to debug a complicated scene where things were hidden.
- We might see some of the features users have dreamed about implemented: collaborative editing ala Google Docs, the ability to select a group of blocks (ala Illustrator) and delete/move/duplicate them, auto-save with a “history” feature that could roll back to any previous version, etc.

In summary, it&#39;s fun to dream about what life in Snap! will be like years from now, and this lightning talk will paint a picture of one such future.</abstract>
<description>The Apple Vision Pro (AVP) has been called a “game-changing device”, allowing for a seamless blending of an augmented reality world, locked in 3D space, with your surroundings. Having explored coding in Snap! on the AVP, with the blocks floating above my table and coding as simple as moving and pinching in space, we got to thinking about what the experience might be like 10 years from now:

- Right now, Snap! on the AVP exists only within a Safari browser, but all the most immersive and intuitive AVP experiences are stand-alone apps designed specifically for the affordances the AVP provides. The Scratch Foundation did this with “ScratchJr” on the iPad and iPhone.
- There is an incredible amount of &quot;3D real estate&quot; with the AVP – enough that Snap! could have _all blocks visible at once_ (i.e., not hidden behind tabs), but floating in 3D space like a rolodex, perhaps, or side-by-side to the left and wrapping around a person.
- Selecting a paint editor could select Apple&#39;s built-in Freeform drawing app (or any other app a user wanted, like Photoshop). Similar for the sound editor.
- Everything done with gestures (what the mouse currently does) could also be accomplished by voice and looking at things – the voice equivalent of “keyboard input”.
- AI will probably support us in most of our everyday tasks ten years from now, so it&#39;s natural to think that there will be an AI assistant helping us code.  It might point out potential bugs before the green flag was clicked, suggest the next block based on what we&#39;re doing, or even author small utility blocks for us (if we asked for it and provided a clear spec).
- The &quot;stage&quot; might extend to be a real &quot;3D stage&quot;, with sprites available in 2D (as transparent PNG textures on always-facing-the-user polygons ala Doom) or 3D. This might allow us to connect with the Alice user community and their resources. At the very least (even if we stuck with a 2D stage), we would be able to “turn” the stage to see what sprites were in front of and behind each other to debug a complicated scene where things were hidden.
- We might see some of the features users have dreamed about implemented: collaborative editing ala Google Docs, the ability to select a group of blocks (ala Illustrator) and delete/move/duplicate them, auto-save with a “history” feature that could roll back to any previous version, etc.

In summary, it&#39;s fun to dream about what life in Snap! will be like years from now, and this lightning talk will paint a picture of one such future.</description>
<recording>
<license />
<optout>false</optout>
</recording>
<persons>
<person id='317'>Dan Garcia</person>
</persons>
</event>
<event guid='onlZ4fYeZI09g8GUvC9DrA' id='739'>
<date>2024-07-31T20:25:00+02:00</date>
<start>18:25</start>
<duration>00:10</duration>
<room>Online Room 1</room>
<type>Lightning Talks Discussion</type>
<language></language>
<slug>739-lightning-talks-3</slug>
<title>Lightning Talks 3</title>
<subtitle></subtitle>
<track></track>
<abstract>Five 7-minute talks - get enlightened :)</abstract>
<description>Five 7-minute talks - get enlightened :)</description>
<recording>
<license />
<optout>false</optout>
</recording>
<persons>
</persons>
</event>
<event guid='fJGCPiD0ftx2lihd1Hz3JA' id='743'>
<date>2024-07-31T20:40:00+02:00</date>
<start>18:40</start>
<duration>01:00</duration>
<room>Online Room 1</room>
<type>Keynote</type>
<language></language>
<slug>743-what-s-new-in-snap-10</slug>
<title>What&#39;s new in Snap! 10</title>
<subtitle></subtitle>
<track></track>
<abstract>Get to know what&#39;s new in Snap! </abstract>
<description>Get to know what&#39;s new in Snap! </description>
<recording>
<license />
<optout>false</optout>
</recording>
<persons>
<person id='49'>Jens Mönig</person>
</persons>
</event>
<event guid='2BKdt0q6sF9BYIey-bw4fA' id='742'>
<date>2024-07-31T21:40:00+02:00</date>
<start>19:40</start>
<duration>00:10</duration>
<room>Online Room 1</room>
<type>Plenary Session</type>
<language></language>
<slug>742-goodbye-day-1</slug>
<title>Goodbye Day 1</title>
<subtitle></subtitle>
<track></track>
<abstract>Day 1 is a wrap, we&#39;ll see you tomorrow for BoFs ❤️</abstract>
<description>Day 1 is a wrap, we&#39;ll see you tomorrow for BoFs ❤️</description>
<recording>
<license />
<optout>false</optout>
</recording>
<persons>
</persons>
</event>
</room>
</day>
<day date='2024-08-01' index='2'>
<room name='Online Room 1'>
<event guid='OGAeaoIBRWPNqosWr-QZuQ' id='744'>
<date>2024-08-01T17:20:00+02:00</date>
<start>15:20</start>
<duration>00:10</duration>
<room>Online Room 1</room>
<type>Plenary Session</type>
<language></language>
<slug>744-welcome-to-bof-day</slug>
<title>Welcome to BoF Day</title>
<subtitle></subtitle>
<track></track>
<abstract>Let&#39;s discuss, share and create ❤️</abstract>
<description>Let&#39;s discuss, share and create ❤️</description>
<recording>
<license />
<optout>false</optout>
</recording>
<persons>
<person id='312'>Jadga Hügle</person>
</persons>
</event>
<event guid='rjPFiw0btOxerUGCCNvjSw' id='707'>
<date>2024-08-01T17:30:00+02:00</date>
<start>15:30</start>
<duration>01:00</duration>
<room>Online Room 1</room>
<type>Birds of a Feather</type>
<language></language>
<slug>707-using-snap-to-teach-mathematics</slug>
<title>Using Snap! to teach mathematics</title>
<subtitle>Using visualizations and algorithms to teach mathematics</subtitle>
<track></track>
<abstract>Snap! can be used in teaching mathematics to visualize mathematical concepts, or to use algorithms to do calculations that are difficult or tiresome to do with a calculator.
In this session participants have an opportunity to exchange examples from their own teaching practice.

Examples: [https://snap.berkeley.edu/user?username=mattgig](https://snap.berkeley.edu/user?username=mattgig)</abstract>
<description>Snap! can be used in teaching mathematics to visualize mathematical concepts, or to use algorithms to do calculations that are difficult or tiresome to do with a calculator.
In this session participants have an opportunity to exchange examples from their own teaching practice.

Examples: [https://snap.berkeley.edu/user?username=mattgig](https://snap.berkeley.edu/user?username=mattgig)</description>
<recording>
<license />
<optout>false</optout>
</recording>
<persons>
<person id='367'>Matthias Giger</person>
</persons>
</event>
<event guid='8dxmz6nEHGsM2E7BYU29Wg' id='720'>
<date>2024-08-01T18:40:00+02:00</date>
<start>16:40</start>
<duration>01:00</duration>
<room>Online Room 1</room>
<type>Birds of a Feather</type>
<language></language>
<slug>720-supporting-snap-students-in-a-transition-to-python</slug>
<title>Supporting Snap! Students in a Transition to Python</title>
<subtitle></subtitle>
<track></track>
<abstract>Most of us agree that Snap! is the perfect language for beginning programmers. When students ask us “What is the next language to learn?” the answer is often &quot;Python&quot;. That transition is often rocky: Python&#39;s 0-indexing, = vs ==, quotes around strings, while-vs-repeat-until, and no spaces allowed in variable or function names are just some of the challenges students face.

However, there&#39;s hope. Snap!&#39;s new Python-like features, like the “^” block, “item (numbers from __ to __) of _data_” hyperblock slicing, and allowing non-numeric values to be dictionary keys in “item _key_ of _data_”, make the transition that much easier. Snap! has the “codification” example that can export code to Python. There&#39;s a demo “split-screen” website that lets someone code Snap! on the left while the Python equivalent shows up on the right (“live” codification, if you will). UC Berkeley&#39;s Beauty and Joy of Computing (BJC) course provides lectures, videos, discussion worksheets, and labs to help the transition. EDC&#39;s AP CS Principles BJC site has an &quot;Other Programming Languages&quot; page with resources they are developing, since the AP CSP exam&#39;s questions are now in Python (they had been in a Snap!-like pseudocode language). Are there other opportunities and resources like that? Could Large Language Models help? 

Let&#39;s gather in this BoF session to discuss experiences guiding students through that process, share software and curricular support, etc.</abstract>
<description>Most of us agree that Snap! is the perfect language for beginning programmers. When students ask us “What is the next language to learn?” the answer is often &quot;Python&quot;. That transition is often rocky: Python&#39;s 0-indexing, = vs ==, quotes around strings, while-vs-repeat-until, and no spaces allowed in variable or function names are just some of the challenges students face.

However, there&#39;s hope. Snap!&#39;s new Python-like features, like the “^” block, “item (numbers from __ to __) of _data_” hyperblock slicing, and allowing non-numeric values to be dictionary keys in “item _key_ of _data_”, make the transition that much easier. Snap! has the “codification” example that can export code to Python. There&#39;s a demo “split-screen” website that lets someone code Snap! on the left while the Python equivalent shows up on the right (“live” codification, if you will). UC Berkeley&#39;s Beauty and Joy of Computing (BJC) course provides lectures, videos, discussion worksheets, and labs to help the transition. EDC&#39;s AP CS Principles BJC site has an &quot;Other Programming Languages&quot; page with resources they are developing, since the AP CSP exam&#39;s questions are now in Python (they had been in a Snap!-like pseudocode language). Are there other opportunities and resources like that? Could Large Language Models help? 

Let&#39;s gather in this BoF session to discuss experiences guiding students through that process, share software and curricular support, etc.</description>
<recording>
<license />
<optout>false</optout>
</recording>
<persons>
<person id='1543'>Victoria Phelps</person>
<person id='4969'>Parinaz Dastur</person>
<person id='597'>Talia Ye</person>
<person id='1403'>Delnavaz Dastur</person>
<person id='287'>Mary Fries</person>
<person id='1044'>Yuan Garcia</person>
<person id='2'>Michael Ball</person>
<person id='317'>Dan Garcia</person>
</persons>
</event>
<event guid='PyfRJ7aZII8VBxtR2oNZkA' id='745'>
<date>2024-08-01T19:40:00+02:00</date>
<start>17:40</start>
<duration>00:25</duration>
<room>Online Room 1</room>
<type>Break</type>
<language></language>
<slug>745-break</slug>
<title>Break</title>
<subtitle></subtitle>
<track></track>
<abstract>Enjoy a coffee or grab some food, we&#39;ll see you soon :)</abstract>
<description>Enjoy a coffee or grab some food, we&#39;ll see you soon :)</description>
<recording>
<license />
<optout>false</optout>
</recording>
<persons>
</persons>
</event>
<event guid='rEla3jM2pRXUsRrYmWWFVg' id='721'>
<date>2024-08-01T20:05:00+02:00</date>
<start>18:05</start>
<duration>01:00</duration>
<room>Online Room 1</room>
<type>Birds of a Feather</type>
<language></language>
<slug>721-translating-snap-and-curriculum-built-on-snap-to-non-english-languages</slug>
<title>Translating Snap! (and curriculum built on Snap!) to non-English languages</title>
<subtitle>...objects in mirror are larger than they appear...</subtitle>
<track></track>
<abstract>Snap! is available in 47 languages, and has a truly international community of users. The architecture that has made that possible, and the efforts of the folks who have helped with those translations, are deeply appreciated.

However, having worked for the last two years to translate the Beauty and Joy of Computing (BJC) AP CS Principles curriculum to Spanish, we have discovered that what might seem like the simple task of moving a curriculum to another language actually reveals a remarkable list of inter-dependencies on English and difficulties in managing such a large project. So many tasks were labor-intensive, and scripting didn&#39;t help. Among the challenges:

- There was no automated way to translate all the hundreds of PNG images of blocks. “Smart PNGs” (with the block XML embedded in the metadata) are amazing and now allow for automation, but the images were created _before_ that feature existed. Animated GIFs showing how to do something had to be recreated by hand.
- Not all the blocks (or error messages, or menu items, or dialog boxes) in Snap! have been translated to all 47 languages: once a translation is created and new features are later added, all translation teams need to be informed about the additions. Perhaps a team needs to realize it&#39;s not a one-and-done process and check back every 6 months?
- Snap! help text currently exists only in English.
- A curriculum translation often takes a fair amount of time, and sometimes the curriculum or Snap! changes before it&#39;s done, so managing those different versions can be difficult. 

Could automation and large language models help? Should there be a “translation regression test” feature? It would go through all the blocks, libraries, menus, dialogs, help screens, etc. and create a PDF with three columns: the English, the current translation, and the place in the translation file to make a change.

These are just some ideas we&#39;d like to explore in this BoF. If you have been involved in any translations of Snap! (or curriculum built on it), please share best (and worst) practices. If you have ideas to leverage automation or tooling to help with future translations, please attend and share your thoughts!</abstract>
<description>Snap! is available in 47 languages, and has a truly international community of users. The architecture that has made that possible, and the efforts of the folks who have helped with those translations, are deeply appreciated.

However, having worked for the last two years to translate the Beauty and Joy of Computing (BJC) AP CS Principles curriculum to Spanish, we have discovered that what might seem like the simple task of moving a curriculum to another language actually reveals a remarkable list of inter-dependencies on English and difficulties in managing such a large project. So many tasks were labor-intensive, and scripting didn&#39;t help. Among the challenges:

- There was no automated way to translate all the hundreds of PNG images of blocks. “Smart PNGs” (with the block XML embedded in the metadata) are amazing and now allow for automation, but the images were created _before_ that feature existed. Animated GIFs showing how to do something had to be recreated by hand.
- Not all the blocks (or error messages, or menu items, or dialog boxes) in Snap! have been translated to all 47 languages: once a translation is created and new features are later added, all translation teams need to be informed about the additions. Perhaps a team needs to realize it&#39;s not a one-and-done process and check back every 6 months?
- Snap! help text currently exists only in English.
- A curriculum translation often takes a fair amount of time, and sometimes the curriculum or Snap! changes before it&#39;s done, so managing those different versions can be difficult. 

Could automation and large language models help? Should there be a “translation regression test” feature? It would go through all the blocks, libraries, menus, dialogs, help screens, etc. and create a PDF with three columns: the English, the current translation, and the place in the translation file to make a change.

These are just some ideas we&#39;d like to explore in this BoF. If you have been involved in any translations of Snap! (or curriculum built on it), please share best (and worst) practices. If you have ideas to leverage automation or tooling to help with future translations, please attend and share your thoughts!</description>
<recording>
<license />
<optout>false</optout>
</recording>
<persons>
<person id='287'>Mary Fries</person>
<person id='317'>Dan Garcia</person>
</persons>
</event>
<event guid='Ty7I4c0SgyBa10G1c4GxXA' id='747'>
<date>2024-08-01T21:05:00+02:00</date>
<start>19:05</start>
<duration>00:10</duration>
<room>Online Room 1</room>
<type>Plenary Session</type>
<language></language>
<slug>747-goodbye-day-2</slug>
<title>Goodbye Day 2</title>
<subtitle></subtitle>
<track></track>
<abstract>Snap!shot is already over again.
We&#39;ll see you next year. </abstract>
<description>Snap!shot is already over again.
We&#39;ll see you next year. </description>
<recording>
<license />
<optout>false</optout>
</recording>
<persons>
</persons>
</event>
</room>
<room name='Online Room 2'>
<event guid='mF9xViAPu13kH7qQgD7lYA' id='717'>
<date>2024-08-01T18:40:00+02:00</date>
<start>16:40</start>
<duration>01:00</duration>
<room>Online Room 2</room>
<type>Birds of a Feather</type>
<language></language>
<slug>717-thinking-about-thinking-about-coding</slug>
<title>Thinking about thinking about coding</title>
<subtitle>A conversation about how to tackle a coding project in Snap!</subtitle>
<track></track>
<abstract>In the past few years I&#39;ve used Turtlestitch for &#39;serious&#39; projects involving embroidery and quilting. I&#39;ve also taught intro computer science for over 40 years. I&#39;ve developed a systematic way of tackling a project: experimenting with &#39;hat blocks&#39;, then adding variables, then creating my own blocks only where necessary. I have a bunch of tricks as well, such as gathering my &#39;paints&#39; (i.e. the blocks I need) into the workspace and then duplicating them rather than switching between block categories. I&#39;m sure other experienced blocks-language coders have their own systems. I&#39;d like to share them in a semi-structured setting. I&#39;ll spend about 5 minutes demoing what I do and then open it up for discussion. Others are welcome to show how they tackle a problem as well. In the spirit of &quot;you can&#39;t think seriously about thinking without thinking about thinking about something&quot;, the goal here is, as a group, to identify techniques for navigating the blocks environment in a way that keeps the coder focused on the goal while simultaneously supporting the serendipity of discovery. A part of this, of course, is articulating how to deal with &quot;why isn&#39;t this working?&quot; This is a collaborative conversation, not a formal presentation.</abstract>
<description>In the past few years I&#39;ve used Turtlestitch for &#39;serious&#39; projects involving embroidery and quilting. I&#39;ve also taught intro computer science for over 40 years. I&#39;ve developed a systematic way of tackling a project: experimenting with &#39;hat blocks&#39;, then adding variables, then creating my own blocks only where necessary. I have a bunch of tricks as well, such as gathering my &#39;paints&#39; (i.e. the blocks I need) into the workspace and then duplicating them rather than switching between block categories. I&#39;m sure other experienced blocks-language coders have their own systems. I&#39;d like to share them in a semi-structured setting. I&#39;ll spend about 5 minutes demoing what I do and then open it up for discussion. Others are welcome to show how they tackle a problem as well. In the spirit of &quot;you can&#39;t think seriously about thinking without thinking about thinking about something&quot;, the goal here is, as a group, to identify techniques for navigating the blocks environment in a way that keeps the coder focused on the goal while simultaneously supporting the serendipity of discovery. A part of this, of course, is articulating how to deal with &quot;why isn&#39;t this working?&quot; This is a collaborative conversation, not a formal presentation.</description>
<recording>
<license />
<optout>false</optout>
</recording>
<persons>
<person id='2191'>Ursula Wolz</person>
</persons>
</event>
</room>
</day>
</schedule>
