Commit: Merge branch 'main' into feat/docs/setup-landing
Merge branch 'main' into feat/docs/setup-landing
michalsek committed Dec 19, 2024
2 parents be17db6 + 365185a commit 98e9f48
Showing 19 changed files with 2,517 additions and 680 deletions.
50 changes: 48 additions & 2 deletions packages/audiodocs/docs/fundamentals/lets-make-some-noise.mdx
@@ -62,8 +62,40 @@ We have used `decodeAudioDataSource` method of the `AudioContext`, which takes u

## Play the audio

The last step is to create an `AudioBufferSourceNode`. It is one of the few nodes that act as starting points in the audio graph and can "produce" sound; all of them expose `start` and `stop` methods. We will assign the `audioBuffer` obtained in the previous step, connect the node to the `AudioContext` destination, and start playback. For the purpose of this guide, we will play the sound for only 10 seconds.

```jsx {15-16,18-20}
import React from 'react';
import * as FileSystem from 'expo-file-system';
import { View, Button } from 'react-native';
import { AudioContext } from 'react-native-audio-api';

export default function App() {
  const handlePlay = async () => {
    const audioContext = new AudioContext();

    const audioBuffer = await FileSystem.downloadAsync(
      'https://software-mansion-labs.github.io/react-native-audio-api/audio/music/example-music-01.mp3',
      `${FileSystem.documentDirectory}/example-music-01.mp3`
    ).then(({ uri }) => audioContext.decodeAudioDataSource(uri));

    const playerNode = audioContext.createBufferSource();
    playerNode.buffer = audioBuffer;

    playerNode.connect(audioContext.destination);
    playerNode.start(audioContext.currentTime);
    playerNode.stop(audioContext.currentTime + 10);
  };

  return (
    <View style={{ flex: 1, justifyContent: 'center', alignItems: 'center' }}>
      <Button onPress={handlePlay} title="Play sound!" />
    </View>
  );
}
```

And that's it! You have just played your first sound using react-native-audio-api. You can hear how it works in the live example below:

import InteractiveExample from '@site/src/components/InteractiveExample';
import LetsMakeSomeNoise from '@site/src/examples/LetsMakeSomeNoise/Component';
@@ -76,3 +108,17 @@ import LetsMakeSomeNoiseSrc from '!!raw-loader!@site/src/examples/LetsMakeSomeNo
In a web environment, you can use `decodeAudioDataSource` directly on the asset URL, without needing to download it first.

:::
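The web flow described above can be sketched as a small helper. This is a minimal illustration, not part of the library: `playFromUrl` is a hypothetical function name, and it assumes an `AudioContext` instance created from react-native-audio-api's web build, using only the methods shown in the example above (`decodeAudioDataSource`, `createBufferSource`, `connect`, `start`):

```jsx
// Hypothetical helper illustrating the web flow; assumes `audioContext`
// is an AudioContext from 'react-native-audio-api' (web build), e.g.:
//   const audioContext = new AudioContext();
async function playFromUrl(audioContext, url) {
  // On web, decode straight from the remote URL; no download step is needed.
  const audioBuffer = await audioContext.decodeAudioDataSource(url);

  const playerNode = audioContext.createBufferSource();
  playerNode.buffer = audioBuffer;

  playerNode.connect(audioContext.destination);
  playerNode.start(audioContext.currentTime);

  return playerNode;
}
```

Compared with the native example, the `FileSystem.downloadAsync` step simply disappears; everything after decoding is identical.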

## Summary

In this guide, we have learned how to create a simple audio player using `AudioContext` and `AudioBufferSourceNode`, as well as how to load audio data from a remote source. To sum up:

- `AudioContext` is the main object that controls the audio graph.
- The `decodeAudioDataSource` method can be used to load audio data from a local audio source in the form of an `AudioBuffer`.
- An `AudioBufferSourceNode` can be used to play any `AudioBuffer`.
- In order to hear the sounds, we need to connect the source node to the destination node exposed by `AudioContext`.
- We can control the playback of the sound using the `start` and `stop` methods of the `AudioBufferSourceNode` (and of the other source nodes, which we will show later).

## What's next?

In [the next section](/fundamentals/making-a-piano-keyboard) we will learn more about how the audio graph works, what audio params are, and how we can use them to create a simple piano keyboard.