react-voice-visualizer - npm Package Compare versions

Package version was removed
This package version has been unpublished, most likely due to security reasons.

Comparing version (unpublished) to 1.5.14

dist/hooks/useLatest.d.ts
type UseLatestReturnType<T> = {
readonly current: T;
};
declare function useLatest<T>(value: T): UseLatestReturnType<T>;
export default useLatest;
export declare function useLatest<T>(value: T): UseLatestReturnType<T>;
export {};
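For reference, `useLatest` is a small utility hook that keeps a ref pointing at the most recent value, so long-lived callbacks (event listeners, timers) can read fresh state without being re-created. A minimal sketch matching the declaration above and the `useRef`/`useLayoutEffect` pattern visible in the minified bundle below — illustrative, not the package's exact source:

```typescript
import { useLayoutEffect, useRef } from "react";

type UseLatestReturnType<T> = {
  readonly current: T;
};

export function useLatest<T>(value: T): UseLatestReturnType<T> {
  // The ref is updated synchronously after every render, so
  // ref.current always holds the latest value passed in.
  const ref = useRef(value);
  useLayoutEffect(() => {
    ref.current = value;
  }, [value]);
  return ref;
}
```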
import { Controls, useVoiceVisualizerParams } from "../types/types.ts";
declare function useVoiceVisualizer({ onStartRecording, onStopRecording, onPausedRecording, onResumedRecording, onClearCanvas, onEndAudioPlayback, onStartAudioPlayback, onPausedAudioPlayback, onResumedAudioPlayback, }?: useVoiceVisualizerParams): Controls;
declare function useVoiceVisualizer({ onStartRecording, onStopRecording, onPausedRecording, onResumedRecording, onClearCanvas, onEndAudioPlayback, onStartAudioPlayback, onPausedAudioPlayback, onResumedAudioPlayback, onErrorPlayingAudio, }?: useVoiceVisualizerParams): Controls;
export default useVoiceVisualizer;
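The only change to this declaration is the new `onErrorPlayingAudio` parameter. A hedged usage sketch (the component name and handlers are illustrative):

```typescript jsx
import { useVoiceVisualizer, VoiceVisualizer } from "react-voice-visualizer";

const Recorder = () => {
  const recorderControls = useVoiceVisualizer({
    // Existing callback, unchanged in this version:
    onStartRecording: () => console.log("recording started"),
    // Added in this version: invoked when audio.play() fails.
    onErrorPlayingAudio: (error) => console.error("playback failed:", error),
  });

  return (
    <VoiceVisualizer ref={recorderControls.audioRef} controls={recorderControls} />
  );
};

export default Recorder;
```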
(function(){"use strict";(e=>{try{if(typeof window>"u")return;var i=document.createElement("style");i.appendChild(document.createTextNode(e)),document.head.appendChild(i)}catch(o){console.error("vite-plugin-css-injected-by-js",o)}})(".voice-visualizer__buttons-container{display:flex;justify-content:center;align-items:center;column-gap:20px;row-gap:15px;flex-wrap:wrap;margin-bottom:40px}.voice-visualizer__btn-center{box-sizing:border-box;flex-shrink:0;width:60px;height:60px;padding:0;display:flex;justify-content:center;align-items:center;border-radius:50%;background-color:#fff;border:4px solid #c5c5c5;outline:none;cursor:pointer;transition:border-color .3s,background-color .3s}.voice-visualizer__btn-center:hover{background-color:#eaeaea}.voice-visualizer__btn-center>img{width:auto;height:50%;max-height:30px}.voice-visualizer__btn-center.voice-visualizer__btn-center-pause{background-color:#ff3030}.voice-visualizer__btn-center.voice-visualizer__btn-center-pause:hover{background-color:#ff4f4f}.voice-visualizer__btn-center.voice-visualizer__btn-center-pause>img{height:50%;max-height:16px}.voice-visualizer__btn-center:hover{border:4px solid #9f9f9f}.voice-visualizer__btn-left{box-sizing:border-box;flex-shrink:0;width:60px;height:60px;padding:0;display:flex;justify-content:center;align-items:center;border-radius:50%;background-color:#ff3030;border:4px solid #c5c5c5;outline:none;cursor:pointer;transition:border-color .3s,background-color .3s,opacity .3s}.voice-visualizer__btn-left:hover{background-color:#ff4f4f}.voice-visualizer__btn-left:disabled{opacity:.6;background-color:#ff3030}.voice-visualizer__btn-left.voice-visualizer__btn-left-microphone{background-color:#fff}.voice-visualizer__btn-left.voice-visualizer__btn-left-microphone>img{width:auto;height:50%;max-height:30px}.voice-visualizer__btn-left>img{width:auto;height:50%;max-height:16px}.voice-visualizer__btn-left:hover{border:4px solid #9f9f9f}.voice-visualizer__btn{box-sizing:border-box;min-width:100px;min-height:60px;padding:5px 20px;border-radius:40px;font-size:15px;background-color:#f0f0f0;transition:background-color .3s,opacity .3s}.voice-visualizer__btn:disabled{opacity:.8;background-color:#f0f0f0}.voice-visualizer__btn:hover{background-color:#bebebe}.voice-visualizer__canvas-container{position:relative;width:fit-content;margin:0 auto;overflow:hidden}.voice-visualizer__canvas-container canvas{display:block}.voice-visualizer__canvas-microphone-btn{position:absolute;top:50%;left:50%;width:auto;max-width:12%;min-width:24px;height:50%;max-height:100px;background-color:transparent;border:none;outline:none;transform:translate(-50%,-50%)}.voice-visualizer__canvas-microphone-icon{width:100%;height:100%;will-change:transform;transition:transform .3s}.voice-visualizer__canvas-microphone-btn:hover .voice-visualizer__canvas-microphone-icon{transform:scale(1.03)}.voice-visualizer__canvas-audio-wave-icon{position:absolute;top:50%;left:50%;width:auto;max-width:40%;height:40%;max-height:100px;transform:translate(-118%,-50%) scale(-1)}.voice-visualizer__canvas-audio-wave-icon2{transform:translate(18%,-50%)}.voice-visualizer__canvas-audio-processing{position:absolute;top:50%;left:50%;margin:0;transform:translate(-50%,-50%)}.voice-visualizer__progress-indicator-hovered{position:absolute;top:0;pointer-events:none;height:100%;width:1px;background-color:#85858599}.voice-visualizer__progress-indicator-hovered-time{position:absolute;top:3%;left:1px;width:fit-content;margin:0;padding:0 7px;opacity:.8;font-size:12px;border-radius:0 4px 4px 
0;background-color:#575757;text-align:left}.voice-visualizer__progress-indicator-hovered-time.voice-visualizer__progress-indicator-hovered-time-left{left:unset;right:1px;border-radius:4px 0 0 4px}.voice-visualizer__progress-indicator{position:absolute;top:0;pointer-events:none;height:100%;width:1px;background-color:#efefef}.voice-visualizer__progress-indicator-time{position:absolute;top:3%;left:1px;width:fit-content;box-sizing:border-box;min-width:37px;margin:0;padding:0 7px;font-size:12px;border-radius:0 4px 4px 0;text-align:left;color:#000;font-weight:500;background-color:#efefef}.voice-visualizer__progress-indicator-time.voice-visualizer__progress-indicator-time-left{left:unset;right:1px;border-radius:4px 0 0 4px}.voice-visualizer__audio-info-container{box-sizing:border-box;height:55px;display:flex;align-items:center;justify-content:center;gap:30px}.voice-visualizer__audio-info-time{margin:15px 0;min-width:38px;text-align:left}.voice-visualizer__visually-hidden{position:absolute;width:1px;height:1px;margin:-1px;padding:0;border:4px solid #c5c5c5;white-space:nowrap;clip-path:inset(100%);clip:rect(0 0 0 0);overflow:hidden}")})();
import { jsx as a, jsxs as de, Fragment as Ue } from "react/jsx-runtime";
import { useState as l, useRef as y, useCallback as tt, useLayoutEffect as We, forwardRef as rt, useEffect as Z } from "react";
const He = ({
import { jsx as a, jsxs as fe, Fragment as We } from "react/jsx-runtime";
import { useState as u, useRef as S, useCallback as nt, useLayoutEffect as qe, forwardRef as it, useEffect as P } from "react";
const Ze = ({
canvas: e,
backgroundColor: t
}) => {
const n = e.height, r = e.width, c = Math.round(r / 2), s = e.getContext("2d");
return s ? (s.clearRect(0, 0, r, n), t !== "transparent" && (s.fillStyle = t, s.fillRect(0, 0, r, n)), { context: s, height: n, width: r, halfWidth: c }) : null;
}, De = ({
const n = e.height, r = e.width, c = Math.round(r / 2), h = e.getContext("2d");
return h ? (h.clearRect(0, 0, r, n), t !== "transparent" && (h.fillStyle = t, h.fillRect(0, 0, r, n)), { context: h, height: n, width: r, halfWidth: c }) : null;
}, be = ({
context: e,

@@ -16,7 +16,7 @@ color: t,

y: c,
w: s,
h: g
w: h,
h: d
}) => {
e.fillStyle = t, e.beginPath(), e.roundRect ? (e.roundRect(r, c, s, g, n), e.fill()) : e.fillRect(r, c, s, g);
}, nt = ({
e.fillStyle = t, e.beginPath(), e.roundRect ? (e.roundRect(r, c, h, d, n), e.fill()) : e.fillRect(r, c, h, d);
}, ct = ({
barsData: e,

@@ -27,21 +27,21 @@ canvas: t,

backgroundColor: c,
mainBarColor: s,
secondaryBarColor: g,
currentAudioTime: v = 0,
rounded: M,
duration: u
mainBarColor: h,
secondaryBarColor: d,
currentAudioTime: m = 0,
rounded: I,
duration: f
}) => {
const I = He({ canvas: t, backgroundColor: c });
if (!I)
const z = Ze({ canvas: t, backgroundColor: c });
if (!z)
return;
const { context: f, height: w } = I, L = v / u;
e.forEach((o, p) => {
const H = p / e.length, m = L > H;
De({
context: f,
color: m ? g : s,
rounded: M,
x: p * (n + r * n),
y: w / 2 - o.max,
h: o.max * 2,
const { context: g, height: N } = z, w = m / f;
e.forEach((s, M) => {
const R = M / e.length, l = w > R;
be({
context: g,
color: l ? d : h,
rounded: I,
x: M * (n + r * n),
y: N / 2 - s.max,
h: s.max * 2,
w: n

@@ -51,3 +51,3 @@ });

};
function it({
function ot({
context: e,

@@ -58,15 +58,15 @@ color: t,

height: c,
barWidth: s
barWidth: h
}) {
De({
be({
context: e,
color: t,
rounded: n,
x: r / 2 + s / 2,
x: r / 2 + h / 2,
y: c / 2 - 1,
h: 2,
w: r - (r / 2 + s / 2)
w: r - (r / 2 + h / 2)
});
}
const ct = ({
const st = ({
audioData: e,

@@ -77,64 +77,64 @@ unit: t,

canvas: c,
isRecordingInProgress: s,
isPausedRecording: g,
picks: v,
backgroundColor: M,
barWidth: u,
mainBarColor: I,
secondaryBarColor: f,
rounded: w,
animateCurrentPick: L,
fullscreen: o
isRecordingInProgress: h,
isPausedRecording: d,
picks: m,
backgroundColor: I,
barWidth: f,
mainBarColor: z,
secondaryBarColor: g,
rounded: N,
animateCurrentPick: w,
fullscreen: s
}) => {
const p = He({ canvas: c, backgroundColor: M });
if (!p)
const M = Ze({ canvas: c, backgroundColor: I });
if (!M)
return;
const { context: H, height: m, width: x, halfWidth: j } = p;
if (e != null && e.length && s) {
const F = Math.max(...e);
if (!g) {
if (r.current >= u) {
const { context: R, height: l, width: Z, halfWidth: b } = M;
if (e != null && e.length && h) {
const x = Math.max(...e);
if (!d) {
if (r.current >= f) {
r.current = 0;
const D = (m - F / 258 * m) / m * 100, U = (-m + F / 258 * m * 2) / m * 100, V = n.current === u ? {
startY: D,
barHeight: U
const T = (l - x / 258 * l) / l * 100, V = (-l + x / 258 * l * 2) / l * 100, U = n.current === f ? {
startY: T,
barHeight: V
} : null;
n.current >= t ? n.current = u : n.current += u, v.length > (o ? x : j) / u && v.pop(), v.unshift(V);
n.current >= t ? n.current = f : n.current += f, m.length > (s ? Z : b) / f && m.pop(), m.unshift(U);
}
r.current += 1;
}
!o && Q(), L && De({
context: H,
rounded: w,
color: I,
x: o ? x : j,
y: m - F / 258 * m,
h: -m + F / 258 * m * 2,
w: u
!s && ze(), w && be({
context: R,
rounded: N,
color: z,
x: s ? Z : b,
y: l - x / 258 * l,
h: -l + x / 258 * l * 2,
w: f
});
let B = (o ? x : j) - r.current;
v.forEach((D) => {
D && De({
context: H,
color: I,
rounded: w,
x: B,
y: D.startY * m / 100 > m / 2 - 1 ? m / 2 - 1 : D.startY * m / 100,
h: D.barHeight * m / 100 > 2 ? D.barHeight * m / 100 : 2,
w: u
}), B -= u;
let W = (s ? Z : b) - r.current;
m.forEach((T) => {
T && be({
context: R,
color: z,
rounded: N,
x: W,
y: T.startY * l / 100 > l / 2 - 1 ? l / 2 - 1 : T.startY * l / 100,
h: T.barHeight * l / 100 > 2 ? T.barHeight * l / 100 : 2,
w: f
}), W -= f;
});
} else
v.length = 0;
function Q() {
it({
context: H,
color: f,
rounded: w,
width: x,
height: m,
barWidth: u
m.length = 0;
function ze() {
ot({
context: R,
color: g,
rounded: N,
width: Z,
height: l,
barWidth: f
});
}
}, ke = (e) => {
}, Xe = (e) => {
const t = Math.floor(e / 3600), n = Math.floor(e % 3600 / 60), r = e % 60, c = Math.floor(

@@ -153,3 +153,3 @@ (r - Math.floor(r)) * 1e3

).charAt(0)}`;
}, ot = (e) => {
}, at = (e) => {
const t = Math.floor(e / 1e3), n = Math.floor(t / 3600), r = Math.floor(t % 3600 / 60), c = t % 60;

@@ -161,3 +161,3 @@ return n > 0 ? `${String(n).padStart(2, "0")}:${String(r).padStart(

};
function Ye(e) {
function ke(e) {
if (typeof e == "string") {

@@ -170,3 +170,3 @@ const t = Number(e);

}
const st = ({
const ut = ({
bufferData: e,

@@ -178,22 +178,22 @@ height: t,

}) => {
const s = n / (r + c * r), g = Math.floor(e.length / s), v = t / 2;
let M = [], u = 0;
for (let I = 0; I < s; I++) {
const f = [];
let w = 0;
for (let o = 0; o < g && I * g + o < e.length; o++) {
const p = e[I * g + o];
p > 0 && (f.push(p), w++);
const h = n / (r + c * r), d = Math.floor(e.length / h), m = t / 2;
let I = [], f = 0;
for (let z = 0; z < h; z++) {
const g = [];
let N = 0;
for (let s = 0; s < d && z * d + s < e.length; s++) {
const M = e[z * d + s];
M > 0 && (g.push(M), N++);
}
const L = f.reduce((o, p) => o + p, 0) / w;
L > u && (u = L), M.push({ max: L });
const w = g.reduce((s, M) => s + M, 0) / N;
w > f && (f = w), I.push({ max: w });
}
if (v * 0.95 > u * v) {
const I = v * 0.95 / u;
M = M.map((f) => ({
max: f.max > 0.01 ? f.max * I : 1
if (m * 0.95 > f * m) {
const z = m * 0.95 / f;
I = I.map((g) => ({
max: g.max > 0.01 ? g.max * z : 1
}));
}
return M;
}, at = (e) => {
return I;
}, ht = (e) => {
if (!e)

@@ -203,3 +203,3 @@ return "";

return t && t.length >= 2 ? `.${t[1]}` : "";
}, ut = (e) => {
}, lt = (e) => {
const t = Math.floor(e / 3600), n = Math.floor(e % 3600 / 60), r = e % 60, c = Math.floor(

@@ -216,3 +216,3 @@ (r - Math.floor(r)) * 1e3

).charAt(0)}${String(c).charAt(1)}s`;
}, ht = (e) => {
}, mt = (e) => {
onmessage = (t) => {

@@ -222,3 +222,3 @@ postMessage(e(t.data));

};
function lt({
function vt({
fn: e,

@@ -228,22 +228,22 @@ initialValue: t,

}) {
const [r, c] = l(t);
const [r, c] = u(t);
return {
result: r,
setResult: c,
run: (g) => {
const v = new Worker(
run: (d) => {
const m = new Worker(
// eslint-disable-next-line @typescript-eslint/restrict-template-expressions
URL.createObjectURL(new Blob([`(${ht})(${e})`]))
URL.createObjectURL(new Blob([`(${mt})(${e})`]))
);
v.onmessage = (M) => {
M.data && (c(M.data), n && n(), v.terminate());
}, v.onerror = (M) => {
console.error(M.message), v.terminate();
}, v.postMessage(g);
m.onmessage = (I) => {
I.data && (c(I.data), n && n(), m.terminate());
}, m.onerror = (I) => {
console.error(I.message), m.terminate();
}, m.postMessage(d);
}
};
}
const mt = (e, t = 250) => {
const n = y();
return tt(
const dt = (e, t = 250) => {
const n = S();
return nt(
// eslint-disable-next-line @typescript-eslint/no-explicit-any

@@ -259,3 +259,9 @@ (...r) => {

};
const vt = ({
function ft(e) {
const t = S(e);
return qe(() => {
t.current = e;
}, [e]), t;
}
const zt = ({
color: e = "#000000",

@@ -282,3 +288,3 @@ stroke: t = 2,

}
), Ge = ({
), Je = ({
color: e = "#FFFFFF",

@@ -300,10 +306,3 @@ reflect: t

}
), Be = "data:image/svg+xml;base64,PHN2ZyB3aWR0aD0iMjMiIGhlaWdodD0iMzMiIGZpbGw9Im5vbmUiIHhtbG5zPSJodHRwOi8vd3d3LnczLm9yZy8yMDAwL3N2ZyI+CiAgPHBhdGggZD0iTTEuMSAxNi43MmMwIDMgLjk2IDUuOCAzLjYxIDcuOTVhOS45NiA5Ljk2IDAgMCAwIDYuNSAyLjE3bTAgMHY0LjM0aDQuMzQtOC42N200LjM0LTQuMzRjMi4zNSAwIDQuNDItLjQ4IDYuNS0yLjE3YTkuODcgOS44NyAwIDAgMCAzLjYxLTcuOTVNMTEuMjIgMS44MmMtMS40NSAwLTIuNS4zNy0zLjMuOTNhNS42IDUuNiAwIDAgMC0xLjg0IDIuNGMtLjkgMi4wNi0xLjEgNC43Ny0xLjEgNy4yNCAwIDIuNDYuMiA1LjE3IDEuMSA3LjI0YTUuNiA1LjYgMCAwIDAgMS44NCAyLjRjLjguNTUgMS44NS45MiAzLjMuOTIgMS40NCAwIDIuNS0uMzcgMy4yOS0uOTNhNS42IDUuNiAwIDAgMCAxLjg0LTIuNGMuOS0yLjA2IDEuMS00Ljc3IDEuMS03LjIzIDAtMi40Ny0uMi01LjE4LTEuMS03LjI0YTUuNiA1LjYgMCAwIDAtMS44NC0yLjQgNS41MiA1LjUyIDAgMCAwLTMuMy0uOTNaIiBzdHJva2U9IiMwMDAiIHN0cm9rZS1saW5lY2FwPSJyb3VuZCIgc3Ryb2tlLWxpbmVqb2luPSJyb3VuZCIvPgo8L3N2Zz4K", dt = "data:image/svg+xml;base64,PHN2ZyB3aWR0aD0iMjYiIGhlaWdodD0iMjQiIGZpbGw9Im5vbmUiIHhtbG5zPSJodHRwOi8vd3d3LnczLm9yZy8yMDAwL3N2ZyI+CiAgPHBhdGggZD0iTTE4Ljc1IDYuMTZjNC4zMSAyLjYgNi40NiAzLjkgNi40NiA1Ljg0IDAgMS45NS0yLjE1IDMuMjQtNi40NiA1Ljg0bC00Ljg0IDIuOTJjLTQuMzEgMi42LTYuNDYgMy44OS04LjA4IDIuOTItMS42Mi0uOTgtMS42Mi0zLjU3LTEuNjItOC43NlY5LjA4YzAtNS4xOSAwLTcuNzggMS42Mi04Ljc2IDEuNjItLjk3IDMuNzcuMzMgOC4wOCAyLjkybDQuODQgMi45MloiIGZpbGw9IiNmZmYiLz4KPC9zdmc+Cg==", Pe = "data:image/svg+xml;base64,PHN2ZyB3aWR0aD0iMjEiIGhlaWdodD0iMjkiIGZpbGw9Im5vbmUiIHhtbG5zPSJodHRwOi8vd3d3LnczLm9yZy8yMDAwL3N2ZyI+CiAgPHBhdGggZD0iTTE0IDMuNWEzLjUgMy41IDAgMSAxIDcgMHYyMmEzLjUgMy41IDAgMSAxLTcgMHYtMjJaIiBmaWxsPSIjZmZmIi8+CiAgPHJlY3Qgd2lkdGg9IjciIGhlaWdodD0iMjkiIHJ4PSIzLjUiIGZpbGw9IiNmZmYiLz4KPC9zdmc+Cg==", ft = "data:image/svg+xml;base64,PHN2ZyB3aWR0aD0iMjciIGhlaWdodD0iMjUiIGZpbGw9Im5vbmUiIHhtbG5zPSJodHRwOi8vd3d3LnczLm9yZy8yMDAwL3N2ZyI+CiAgPHJlY3QgeD0iLjIxIiB3aWR0aD0iMjYiIGhlaWdodD0iMjUiIHJ4PSI1IiBmaWxsPSIjZmZmIi8+Cjwvc3ZnPgo=";
function zt(e) {
const t = y(e);
return We(() => {
t.current = e;
}, [e]), t;
}
const It = rt(
), Qe = "data:image/svg+xml;base64,PHN2ZyB3aWR0aD0iMjMiIGhlaWdodD0iMzMiIGZpbGw9Im5vbmUiIHhtbG5zPSJodHRwOi8vd3d3LnczLm9yZy8yMDAwL3N2ZyI+CiAgPHBhdGggZD0iTTEuMSAxNi43MmMwIDMgLjk2IDUuOCAzLjYxIDcuOTVhOS45NiA5Ljk2IDAgMCAwIDYuNSAyLjE3bTAgMHY0LjM0aDQuMzQtOC42N200LjM0LTQuMzRjMi4zNSAwIDQuNDItLjQ4IDYuNS0yLjE3YTkuODcgOS44NyAwIDAgMCAzLjYxLTcuOTVNMTEuMjIgMS44MmMtMS40NSAwLTIuNS4zNy0zLjMuOTNhNS42IDUuNiAwIDAgMC0xLjg0IDIuNGMtLjkgMi4wNi0xLjEgNC43Ny0xLjEgNy4yNCAwIDIuNDYuMiA1LjE3IDEuMSA3LjI0YTUuNiA1LjYgMCAwIDAgMS44NCAyLjRjLjguNTUgMS44NS45MiAzLjMuOTIgMS40NCAwIDIuNS0uMzcgMy4yOS0uOTNhNS42IDUuNiAwIDAgMCAxLjg0LTIuNGMuOS0yLjA2IDEuMS00Ljc3IDEuMS03LjIzIDAtMi40Ny0uMi01LjE4LTEuMS03LjI0YTUuNiA1LjYgMCAwIDAtMS44NC0yLjQgNS41MiA1LjUyIDAgMCAwLTMuMy0uOTNaIiBzdHJva2U9IiMwMDAiIHN0cm9rZS1saW5lY2FwPSJyb3VuZCIgc3Ryb2tlLWxpbmVqb2luPSJyb3VuZCIvPgo8L3N2Zz4K", gt = "data:image/svg+xml;base64,PHN2ZyB3aWR0aD0iMjYiIGhlaWdodD0iMjQiIGZpbGw9Im5vbmUiIHhtbG5zPSJodHRwOi8vd3d3LnczLm9yZy8yMDAwL3N2ZyI+CiAgPHBhdGggZD0iTTE4Ljc1IDYuMTZjNC4zMSAyLjYgNi40NiAzLjkgNi40NiA1Ljg0IDAgMS45NS0yLjE1IDMuMjQtNi40NiA1Ljg0bC00Ljg0IDIuOTJjLTQuMzEgMi42LTYuNDYgMy44OS04LjA4IDIuOTItMS42Mi0uOTgtMS42Mi0zLjU3LTEuNjItOC43NlY5LjA4YzAtNS4xOSAwLTcuNzggMS42Mi04Ljc2IDEuNjItLjk3IDMuNzcuMzMgOC4wOCAyLjkybDQuODQgMi45MloiIGZpbGw9IiNmZmYiLz4KPC9zdmc+Cg==", Ve = "data:image/svg+xml;base64,PHN2ZyB3aWR0aD0iMjEiIGhlaWdodD0iMjkiIGZpbGw9Im5vbmUiIHhtbG5zPSJodHRwOi8vd3d3LnczLm9yZy8yMDAwL3N2ZyI+CiAgPHBhdGggZD0iTTE0IDMuNWEzLjUgMy41IDAgMSAxIDcgMHYyMmEzLjUgMy41IDAgMSAxLTcgMHYtMjJaIiBmaWxsPSIjZmZmIi8+CiAgPHJlY3Qgd2lkdGg9IjciIGhlaWdodD0iMjkiIHJ4PSIzLjUiIGZpbGw9IiNmZmYiLz4KPC9zdmc+Cg==", Mt = "data:image/svg+xml;base64,PHN2ZyB3aWR0aD0iMjciIGhlaWdodD0iMjUiIGZpbGw9Im5vbmUiIHhtbG5zPSJodHRwOi8vd3d3LnczLm9yZy8yMDAwL3N2ZyI+CiAgPHJlY3QgeD0iLjIxIiB3aWR0aD0iMjYiIGhlaWdodD0iMjUiIHJ4PSI1IiBmaWxsPSIjZmZmIi8+Cjwvc3ZnPgo=", Lt = it(
({

@@ -315,163 +314,164 @@ controls: {

duration: r,
audioSrc: c,
currentAudioTime: s,
bufferFromRecordedBlob: g,
togglePauseResume: v,
startRecording: M,
stopRecording: u,
saveAudioFile: I,
isAvailableRecordedAudio: f,
isPausedRecordedAudio: w,
isPausedRecording: L,
isProcessingRecordedAudio: o,
isCleared: p,
formattedDuration: H,
formattedRecordingTime: m,
formattedRecordedAudioCurrentTime: x,
clearCanvas: j,
setCurrentAudioTime: Q,
_setIsProcessingAudioOnComplete: F,
_setIsProcessingOnResize: B
currentAudioTime: c,
audioSrc: h,
bufferFromRecordedBlob: d,
togglePauseResume: m,
startRecording: I,
stopRecording: f,
saveAudioFile: z,
isAvailableRecordedAudio: g,
isPausedRecordedAudio: N,
isPausedRecording: w,
isProcessingRecordedAudio: s,
isCleared: M,
formattedDuration: R,
formattedRecordingTime: l,
formattedRecordedAudioCurrentTime: Z,
clearCanvas: b,
setCurrentAudioTime: ze,
_setIsProcessingAudioOnComplete: x,
_setIsProcessingOnResize: W,
_handleAudioPlaybackOnLoad: T
},
width: D = "100%",
width: V = "100%",
height: U = 200,
speed: V = 3,
backgroundColor: $ = "transparent",
mainBarColor: R = "#FFFFFF",
speed: ge = 3,
backgroundColor: H = "transparent",
mainBarColor: $ = "#FFFFFF",
secondaryBarColor: q = "#5e5e5e",
barWidth: P = 2,
gap: fe = 1,
rounded: ne = 5,
isControlPanelShown: ie = !0,
isDownloadAudioButtonShown: ce = !1,
animateCurrentPick: X = !0,
fullscreen: oe = !1,
onlyRecording: W = !1,
isDefaultUIShown: se = !0,
defaultMicrophoneIconColor: Te = R,
defaultAudioWaveIconColor: Le = R,
mainContainerClassName: be,
canvasContainerClassName: ze,
isProgressIndicatorShown: Y = !W,
progressIndicatorClassName: d,
isProgressIndicatorTimeShown: C = !0,
progressIndicatorTimeClassName: K,
isProgressIndicatorOnHoverShown: ae = !W,
progressIndicatorOnHoverClassName: G,
isProgressIndicatorTimeOnHoverShown: T = !0,
progressIndicatorTimeOnHoverClassName: _,
isAudioProcessingTextShown: h = !0,
audioProcessingTextClassName: pe,
controlButtonsClassName: we
}, Se) => {
const [ge, Ee] = l(0), [b, Ce] = l(0), [ee, ue] = l(0), [he, _e] = l(0), [le, te] = l(!1), [Ne, ye] = l(window.innerWidth), [re, Ae] = l(!1), i = Ne < 768, A = Math.trunc(V), S = Math.trunc(fe), N = Math.trunc(
i && S > 0 ? P + 1 : P
), O = N + S * N, z = y(null), je = y([]), me = y(A), ve = y(N), Je = y(N), Me = y(null), Qe = zt(Ne), {
result: Ie,
setResult: Ve,
run: qe
} = lt({
fn: st,
barWidth: k = 2,
gap: Me = 1,
rounded: ce = 5,
isControlPanelShown: X = !0,
isDownloadAudioButtonShown: oe = !1,
animateCurrentPick: K = !0,
fullscreen: se = !1,
onlyRecording: J = !1,
isDefaultUIShown: ae = !0,
defaultMicrophoneIconColor: xe = $,
defaultAudioWaveIconColor: ye = $,
mainContainerClassName: He,
canvasContainerClassName: je,
isProgressIndicatorShown: pe = !J,
progressIndicatorClassName: Ie,
isProgressIndicatorTimeShown: Y = !0,
progressIndicatorTimeClassName: v,
isProgressIndicatorOnHoverShown: E = !J,
progressIndicatorOnHoverClassName: ee,
isProgressIndicatorTimeOnHoverShown: ue = !0,
progressIndicatorTimeOnHoverClassName: B,
isAudioProcessingTextShown: A = !0,
audioProcessingTextClassName: C,
controlButtonsClassName: o
}, De) => {
const [Le, $e] = u(0), [_, Oe] = u(0), [he, Te] = u(0), [we, Se] = u(0), [G, le] = u(!1), [Ee, Ce] = u(window.innerWidth), [te, me] = u(!1), _e = Ee < 768, Ne = Math.trunc(ge), ve = Math.trunc(Me), O = Math.trunc(
_e && ve > 0 ? k + 1 : k
), re = O + ve * O, p = S(null), i = S([]), L = S(Ne), y = S(O), j = S(O), ne = S(null), F = De, Fe = ft(Ee), {
result: ie,
setResult: de,
run: Ke
} = vt({
fn: ut,
initialValue: [],
onMessageReceived: Ke
}), Xe = mt(xe);
Z(() => {
xe();
const E = () => {
Qe.current !== window.innerWidth && (f ? (ye(window.innerWidth), B(!0), Ae(!0), Xe()) : (ye(window.innerWidth), xe()));
onMessageReceived: tt
}), et = dt(Re);
P(() => {
Re();
const D = () => {
Fe.current !== window.innerWidth && (g ? (Ce(window.innerWidth), W(!0), me(!0), et()) : (Ce(window.innerWidth), Re()));
};
return window.addEventListener("resize", E), () => {
window.removeEventListener("resize", E);
return window.addEventListener("resize", D), () => {
window.removeEventListener("resize", D);
};
}, [D, f]), We(() => {
z.current && ((me.current >= A || !e.length) && (me.current = 0, ct({
}, [V, g]), qe(() => {
p.current && ((L.current >= Ne || !e.length) && (L.current = 0, st({
audioData: e,
unit: O,
index: ve,
index2: Je,
canvas: z.current,
picks: je.current,
unit: re,
index: y,
index2: j,
canvas: p.current,
picks: i.current,
isRecordingInProgress: t,
isPausedRecording: L,
backgroundColor: $,
mainBarColor: R,
isPausedRecording: w,
backgroundColor: H,
mainBarColor: $,
secondaryBarColor: q,
barWidth: N,
rounded: ne,
animateCurrentPick: X,
fullscreen: oe
})), me.current += 1);
barWidth: O,
rounded: ce,
animateCurrentPick: K,
fullscreen: se
})), L.current += 1);
}, [
z.current,
p.current,
e,
N,
O,
H,
$,
R,
q,
ne,
oe,
ce,
se,
he
]), Z(() => {
var E, k;
if (f)
return le ? (E = z.current) == null || E.addEventListener("mouseleave", Re) : (k = z.current) == null || k.addEventListener("mouseenter", $e), () => {
var J, Fe;
le ? (J = z.current) == null || J.removeEventListener(
ae,
we
]), P(() => {
var D, Q;
if (g)
return G ? (D = p.current) == null || D.addEventListener("mouseleave", Ye) : (Q = p.current) == null || Q.addEventListener("mouseenter", Ue), () => {
var Ae, Pe;
G ? (Ae = p.current) == null || Ae.removeEventListener(
"mouseleave",
Re
) : (Fe = z.current) == null || Fe.removeEventListener(
Ye
) : (Pe = p.current) == null || Pe.removeEventListener(
"mouseenter",
$e
Ue
);
};
}, [le, f]), Z(() => {
var k;
if (!g || !z.current || t || re)
}, [G, g]), P(() => {
var Q;
if (!d || !p.current || t || te)
return;
if (W) {
j();
if (J) {
b();
return;
}
je.current = [];
const E = g.getChannelData(0);
return qe({
bufferData: E,
height: ee,
width: he,
barWidth: N,
gap: S
}), (k = z.current) == null || k.addEventListener(
i.current = [];
const D = d.getChannelData(0);
return Ke({
bufferData: D,
height: he,
width: we,
barWidth: O,
gap: ve
}), (Q = p.current) == null || Q.addEventListener(
"mousemove",
Oe
Be
), () => {
var J;
(J = z.current) == null || J.removeEventListener(
var Ae;
(Ae = p.current) == null || Ae.removeEventListener(
"mousemove",
Oe
Be
);
};
}, [
g,
b,
ee,
fe,
P,
re
]), Z(() => {
if (!(W || !(Ie != null && Ie.length) || !z.current || o)) {
if (p) {
Ve([]);
d,
_,
he,
Me,
k,
te
]), P(() => {
if (!(J || !(ie != null && ie.length) || !p.current || s)) {
if (M) {
de([]);
return;
}
nt({
barsData: Ie,
canvas: z.current,
barWidth: N,
gap: S,
backgroundColor: $,
mainBarColor: R,
ct({
barsData: ie,
canvas: p.current,
barWidth: O,
gap: ve,
backgroundColor: H,
mainBarColor: $,
secondaryBarColor: q,
currentAudioTime: s,
rounded: ne,
currentAudioTime: c,
rounded: ce,
duration: r

@@ -481,51 +481,50 @@ });

}, [
Ie,
s,
p,
ne,
ie,
c,
M,
ce,
H,
$,
R,
q
]), Z(() => {
o && z.current && He({
canvas: z.current,
backgroundColor: $
]), P(() => {
s && p.current && Ze({
canvas: p.current,
backgroundColor: H
});
}, [o]);
function xe() {
if (!Me.current || !z.current)
}, [s]);
function Re() {
if (!ne.current || !p.current)
return;
me.current = A;
const E = Math.trunc(
Me.current.clientHeight * window.devicePixelRatio / 2
L.current = Ne;
const D = Math.trunc(
ne.current.clientHeight * window.devicePixelRatio / 2
) * 2;
Ce(Me.current.clientWidth), ue(E), _e(
Oe(ne.current.clientWidth), Te(D), Se(
Math.round(
Me.current.clientWidth * window.devicePixelRatio
ne.current.clientWidth * window.devicePixelRatio
)
), Ae(!1);
), me(!1);
}
function Ke() {
B(!1), F(!1);
function tt() {
W(!1), x(!1), F != null && F.current && (F.current.src = h, T());
}
const $e = () => {
te(!0);
}, Re = () => {
te(!1);
}, Oe = (E) => {
Ee(E.offsetX);
}, et = (E) => {
const k = Se;
if (k.current && z.current) {
const J = r / b * (E.clientX - z.current.getBoundingClientRect().left);
k.current.currentTime = J, Q(J);
const Ue = () => {
le(!0);
}, Ye = () => {
le(!1);
}, Be = (D) => {
$e(D.offsetX);
}, rt = (D) => {
if (F != null && F.current && p.current) {
const Q = r / _ * (D.clientX - p.current.getBoundingClientRect().left);
F.current.currentTime = Q, ze(Q);
}
}, Ze = s / r * b;
return /* @__PURE__ */ de("div", { className: `voice-visualizer ${be ?? ""}`, children: [
/* @__PURE__ */ de(
}, Ge = c / r * _;
return /* @__PURE__ */ fe("div", { className: `voice-visualizer ${He ?? ""}`, children: [
/* @__PURE__ */ fe(
"div",
{
className: `voice-visualizer__canvas-container ${ze ?? ""}`,
ref: Me,
style: { width: Ye(D) },
className: `voice-visualizer__canvas-container ${je ?? ""}`,
ref: ne,
style: { width: ke(V) },
children: [

@@ -535,9 +534,9 @@ /* @__PURE__ */ a(

{
ref: z,
width: he,
height: ee,
onClick: et,
ref: p,
width: we,
height: he,
onClick: rt,
style: {
height: Ye(U),
width: b
height: ke(U),
width: _
},

@@ -547,14 +546,14 @@ children: "Your browser does not support HTML5 Canvas."

),
se && p && /* @__PURE__ */ de(Ue, { children: [
/* @__PURE__ */ a(Ge, { color: Le }),
/* @__PURE__ */ a(Ge, { color: Le, reflect: !0 }),
ae && M && /* @__PURE__ */ fe(We, { children: [
/* @__PURE__ */ a(Je, { color: ye }),
/* @__PURE__ */ a(Je, { color: ye, reflect: !0 }),
/* @__PURE__ */ a(
"button",
{
onClick: M,
onClick: I,
className: "voice-visualizer__canvas-microphone-btn",
children: /* @__PURE__ */ a(
vt,
zt,
{
color: Te,
color: xe,
stroke: 0.5,

@@ -567,25 +566,25 @@ className: "voice-visualizer__canvas-microphone-icon"

] }),
h && o && /* @__PURE__ */ a(
A && s && /* @__PURE__ */ a(
"p",
{
className: `voice-visualizer__canvas-audio-processing ${pe ?? ""}`,
style: { color: R },
className: `voice-visualizer__canvas-audio-processing ${C ?? ""}`,
style: { color: $ },
children: "Processing Audio..."
}
),
le && f && !o && !i && ae && /* @__PURE__ */ a(
G && g && !s && !_e && E && /* @__PURE__ */ a(
"div",
{
className: `voice-visualizer__progress-indicator-hovered ${G ?? ""}`,
className: `voice-visualizer__progress-indicator-hovered ${ee ?? ""}`,
style: {
left: ge
left: Le
},
children: T && /* @__PURE__ */ a(
children: ue && /* @__PURE__ */ a(
"p",
{
className: `voice-visualizer__progress-indicator-hovered-time
${b - ge < 70 ? "voice-visualizer__progress-indicator-hovered-time-left" : ""}
${_ ?? ""}`,
children: ke(
r / b * ge
${_ - Le < 70 ? "voice-visualizer__progress-indicator-hovered-time-left" : ""}
${B ?? ""}`,
children: Xe(
r / _ * Le
)

@@ -596,14 +595,14 @@ }

),
Y && f && !o && r ? /* @__PURE__ */ a(
pe && g && !s && r ? /* @__PURE__ */ a(
"div",
{
className: `voice-visualizer__progress-indicator ${d ?? ""}`,
className: `voice-visualizer__progress-indicator ${Ie ?? ""}`,
style: {
left: Ze < b - 1 ? Ze : b - 1
left: Ge < _ - 1 ? Ge : _ - 1
},
children: C && /* @__PURE__ */ a(
children: Y && /* @__PURE__ */ a(
"p",
{
className: `voice-visualizer__progress-indicator-time ${b - s * b / r < 70 ? "voice-visualizer__progress-indicator-time-left" : ""} ${K ?? ""}`,
children: x
className: `voice-visualizer__progress-indicator-time ${_ - c * _ / r < 70 ? "voice-visualizer__progress-indicator-time-left" : ""} ${v ?? ""}`,
children: Z
}

@@ -616,18 +615,18 @@ )

),
ie && /* @__PURE__ */ de(Ue, { children: [
/* @__PURE__ */ de("div", { className: "voice-visualizer__audio-info-container", children: [
t && /* @__PURE__ */ a("p", { className: "voice-visualizer__audio-info-time", children: m }),
r && !o ? /* @__PURE__ */ a("p", { children: H }) : null
X && /* @__PURE__ */ fe(We, { children: [
/* @__PURE__ */ fe("div", { className: "voice-visualizer__audio-info-container", children: [
t && /* @__PURE__ */ a("p", { className: "voice-visualizer__audio-info-time", children: l }),
r && !s ? /* @__PURE__ */ a("p", { children: R }) : null
] }),
/* @__PURE__ */ de("div", { className: "voice-visualizer__buttons-container", children: [
/* @__PURE__ */ fe("div", { className: "voice-visualizer__buttons-container", children: [
t && /* @__PURE__ */ a(
"button",
{
className: `voice-visualizer__btn-left ${L ? "voice-visualizer__btn-left-microphone" : ""}`,
onClick: v,
className: `voice-visualizer__btn-left ${w ? "voice-visualizer__btn-left-microphone" : ""}`,
onClick: m,
children: /* @__PURE__ */ a(
"img",
{
src: L ? Be : Pe,
alt: L ? "Play" : "Pause"
src: w ? Qe : Ve,
alt: w ? "Play" : "Pause"
}

@@ -637,13 +636,13 @@ )

),
!p && /* @__PURE__ */ a(
!M && /* @__PURE__ */ a(
"button",
{
className: `voice-visualizer__btn-left ${t ? "voice-visualizer__visually-hidden" : ""}`,
onClick: v,
disabled: o,
onClick: m,
disabled: s,
children: /* @__PURE__ */ a(
"img",
{
src: w ? dt : Pe,
alt: w ? "Play" : "Pause"
src: N ? gt : Ve,
alt: N ? "Play" : "Pause"
}

@@ -653,8 +652,8 @@ )

),
p && /* @__PURE__ */ a(
M && /* @__PURE__ */ a(
"button",
{
className: "voice-visualizer__btn-center",
onClick: M,
children: /* @__PURE__ */ a("img", { src: Be, alt: "Microphone" })
onClick: I,
children: /* @__PURE__ */ a("img", { src: Qe, alt: "Microphone" })
}

@@ -666,21 +665,21 @@ ),

className: `voice-visualizer__btn-center voice-visualizer__btn-center-pause ${t ? "" : "voice-visualizer__visually-hidden"}`,
onClick: u,
children: /* @__PURE__ */ a("img", { src: ft, alt: "Stop" })
onClick: f,
children: /* @__PURE__ */ a("img", { src: Mt, alt: "Stop" })
}
),
!p && /* @__PURE__ */ a(
!M && /* @__PURE__ */ a(
"button",
{
onClick: j,
className: `voice-visualizer__btn ${we ?? ""}`,
disabled: o,
onClick: b,
className: `voice-visualizer__btn ${o ?? ""}`,
disabled: s,
children: "Clear"
}
),
ce && n && /* @__PURE__ */ a(
oe && n && /* @__PURE__ */ a(
"button",
{
onClick: I,
className: `voice-visualizer__btn ${we ?? ""}`,
disabled: o,
onClick: z,
className: `voice-visualizer__btn ${o ?? ""}`,
disabled: s,
children: "Download Audio"

@@ -690,16 +689,7 @@ }

] })
] }),
f && /* @__PURE__ */ a(
"audio",
{
ref: Se,
src: c,
controls: !0,
style: { display: "none" }
}
)
] })
] });
}
);
function Lt({
function wt({
onStartRecording: e,

@@ -710,40 +700,54 @@ onStopRecording: t,

onClearCanvas: c,
onEndAudioPlayback: s,
onStartAudioPlayback: g,
onPausedAudioPlayback: v,
onResumedAudioPlayback: M
onEndAudioPlayback: h,
onStartAudioPlayback: d,
onPausedAudioPlayback: m,
onResumedAudioPlayback: I,
onErrorPlayingAudio: f
} = {}) {
const [u, I] = l(!1), [f, w] = l(!1), [L, o] = l(null), [p, H] = l(new Uint8Array(0)), [m, x] = l(!1), [j, Q] = l(null), [F, B] = l(null), [D, U] = l(0), [V, $] = l(0), [R, q] = l(0), [P, fe] = l(""), [ne, ie] = l(!0), [ce, X] = l(0), [oe, W] = l(!0), [se, Te] = l(!1), [Le, be] = l(!1), [ze, Y] = l(null), d = y(null), C = y(null), K = y(null), ae = y(null), G = y(null), T = y(null), _ = y(null), h = y(null), pe = !!(F && !m), we = ut(R), Se = ot(D), ge = ke(ce), Ee = Le || m;
Z(() => {
if (!u || f)
const [z, g] = u(!1), [N, w] = u(!1), [s, M] = u(null), [R, l] = u(new Uint8Array(0)), [Z, b] = u(!1), [ze, x] = u(null), [W, T] = u(null), [V, U] = u(0), [ge, H] = u(0), [$, q] = u(0), [k, Me] = u(""), [ce, X] = u(!0), [oe, K] = u(0), [se, J] = u(!0), [ae, xe] = u(!1), [ye, He] = u(!1), [je, pe] = u(!1), [Ie, Y] = u(null), v = S(null), E = S(null), ee = S(null), ue = S(null), B = S(null), A = S(null), C = S(null), o = S(null), De = !!(W && !Z), Le = lt($), $e = at(V), _ = Xe(oe), Oe = ye || Z;
P(() => {
if (!z || N)
return;
const A = setInterval(() => {
const S = performance.now();
U((N) => N + (S - V)), $(S);
const L = setInterval(() => {
const y = performance.now();
U((j) => j + (y - ge)), H(y);
}, 1e3);
return () => clearInterval(A);
}, [V, f, u]), Z(() => {
if (!j || j.size === 0)
return () => clearInterval(L);
}, [ge, N, z]), P(() => {
if (Ie) {
te();
return;
(async () => {
var A;
}
}, [Ie]), P(() => () => {
C.current && cancelAnimationFrame(C.current), A.current && cancelAnimationFrame(A.current), B.current && B.current.disconnect(), E.current && E.current.state !== "closed" && E.current.close(), o != null && o.current && o.current.removeEventListener("ended", re), v.current && v.current.removeEventListener(
"dataavailable",
G
);
}, []), P(() => (!se && !ae && window.addEventListener("beforeunload", he), () => {
window.removeEventListener("beforeunload", he);
}), [se, ae]);
const he = (i) => {
i.preventDefault(), i.returnValue = "";
}, Te = async (i) => {
var L;
if (!(!i || i.size === 0))
try {
Y(null);
const S = new Blob([j], {
type: (A = d.current) == null ? void 0 : A.mimeType
}), N = URL.createObjectURL(S);
N && fe(N);
const O = await j.arrayBuffer(), z = new AudioContext(), je = (ve) => {
B(ve), q(ve.duration - 0.06);
}, me = (ve) => {
Y(ve);
const y = new Blob([i], {
type: (L = v.current) == null ? void 0 : L.mimeType
}), j = URL.createObjectURL(y);
j && Me(j);
const ne = await i.arrayBuffer(), F = new AudioContext(), Fe = (de) => {
T(de), q(de.duration - 0.06);
}, ie = (de) => {
Y(de);
};
z.decodeAudioData(
O,
je,
me
F.decodeAudioData(
ne,
Fe,
ie
);
} catch (S) {
if (console.error("Error processing the audio blob:", S), S instanceof Error) {
Y(S);
} catch (y) {
if (console.error("Error processing the audio blob:", y), y instanceof Error) {
Y(y);
return;

@@ -753,26 +757,10 @@ }

}
})();
}, [j]), Z(() => {
if (ze) {
te();
return;
}
}, [ze]), Z(() => () => {
_.current && cancelAnimationFrame(_.current), G.current && G.current.disconnect(), C.current && C.current.state !== "closed" && C.current.close(), T.current && cancelAnimationFrame(T.current), h != null && h.current && h.current.removeEventListener("ended", re), d.current && d.current.removeEventListener(
"dataavailable",
ue
);
}, []), Z(() => (!oe && !se && window.addEventListener("beforeunload", b), () => {
window.removeEventListener("beforeunload", b);
}), [oe, se]);
const b = (i) => {
i.preventDefault(), i.returnValue = "";
}, Ce = () => {
}, we = () => {
navigator.mediaDevices.getUserMedia({ audio: !0 }).then((i) => {
te(), W(!1), $(performance.now()), I(!0), o(i), C.current = new window.AudioContext(), K.current = C.current.createAnalyser(), ae.current = new Uint8Array(
K.current.frequencyBinCount
), G.current = C.current.createMediaStreamSource(i), G.current.connect(K.current), d.current = new MediaRecorder(i), d.current.addEventListener(
te(), J(!1), H(performance.now()), g(!0), M(i), E.current = new window.AudioContext(), ee.current = E.current.createAnalyser(), ue.current = new Uint8Array(
ee.current.frequencyBinCount
), B.current = E.current.createMediaStreamSource(i), B.current.connect(ee.current), v.current = new MediaRecorder(i), v.current.addEventListener(
"dataavailable",
ue
), d.current.start(), ee();
G
), v.current.start(), Se();
}).catch((i) => {

@@ -785,82 +773,99 @@ if (console.error("Error starting audio recording:", i), i instanceof Error) {

});
}, ee = () => {
K.current.getByteTimeDomainData(ae.current), H(new Uint8Array(ae.current)), T.current = requestAnimationFrame(ee);
}, ue = (i) => {
d.current && Q(i.data);
}, he = () => {
_.current && cancelAnimationFrame(_.current), h.current && (X(h.current.currentTime), _.current = requestAnimationFrame(he));
}, _e = () => {
u || (e && e(), Ce());
}, Se = () => {
ee.current.getByteTimeDomainData(ue.current), l(new Uint8Array(ue.current)), A.current = requestAnimationFrame(Se);
}, G = (i) => {
v.current && (o.current = new Audio(), x(i.data), Te(i.data));
}, le = () => {
u && (t && t(), x(!0), I(!1), U(0), w(!1), T.current && cancelAnimationFrame(T.current), G.current && G.current.disconnect(), C.current && C.current.state !== "closed" && C.current.close(), L == null || L.getTracks().forEach((i) => i.stop()), d.current && (d.current.stop(), d.current.removeEventListener(
o.current && (K(o.current.currentTime), C.current = requestAnimationFrame(le));
}, Ee = () => {
z || (e && e(), we());
}, Ce = () => {
z && (t && t(), b(!0), g(!1), U(0), w(!1), A.current && cancelAnimationFrame(A.current), B.current && B.current.disconnect(), E.current && E.current.state !== "closed" && E.current.close(), s == null || s.getTracks().forEach((i) => i.stop()), v.current && (v.current.stop(), v.current.removeEventListener(
"dataavailable",
ue
G
)));
}, te = () => {
T.current && cancelAnimationFrame(T.current), h != null && h.current && h.current.removeEventListener("ended", re), _.current && cancelAnimationFrame(_.current), d.current && (d.current.removeEventListener(
A.current && cancelAnimationFrame(A.current), o != null && o.current && o.current.removeEventListener("ended", re), C.current && cancelAnimationFrame(C.current), v.current && (v.current.removeEventListener(
"dataavailable",
ue
), d.current.stop(), d.current = null), L == null || L.getTracks().forEach((i) => i.stop()), d.current = null, C.current = null, K.current = null, ae.current = null, G.current = null, T.current = null, _.current = null, c && c(), o(null), I(!1), x(!1), Q(null), B(null), U(0), $(0), q(0), fe(""), X(0), ie(!0), w(!1), H(new Uint8Array(0)), Y(null), W(!0);
G
), v.current.stop(), v.current = null), s == null || s.getTracks().forEach((i) => i.stop()), v.current = null, E.current = null, ee.current = null, ue.current = null, B.current = null, A.current = null, C.current = null, c && c(), M(null), g(!1), b(!1), x(null), T(null), U(0), H(0), q(0), Me(""), K(0), X(!0), w(!1), l(new Uint8Array(0)), Y(null), J(!0);
}, me = () => {
if (o.current && o.current.paused) {
const i = o.current.play();
i !== void 0 && i.catch((L) => {
console.error(L), f && f(
L instanceof Error ? L : new Error("Error playing audio")
);
});
}
}, _e = () => {
pe(!0), me();
}, Ne = (i) => {
i instanceof Blob && (te(), Te(!0), W(!1), x(!0), I(!1), U(0), w(!1), Q(i));
}, ye = () => {
var i, A, S, N;
if (u) {
w((O) => !O), ((i = d.current) == null ? void 0 : i.state) === "recording" ? (n && n(), (A = d.current) == null || A.pause(), U((O) => O + (performance.now() - V)), T.current && cancelAnimationFrame(T.current)) : (r && r(), (S = d.current) == null || S.resume(), $(performance.now()), T.current = requestAnimationFrame(ee));
i instanceof Blob && (te(), o.current = new Audio(), xe(!0), J(!1), b(!0), g(!1), U(0), w(!1), x(i), Te(i), me());
}, ve = () => {
var i, L, y;
if (z) {
w((j) => !j), ((i = v.current) == null ? void 0 : i.state) === "recording" ? (n && n(), (L = v.current) == null || L.pause(), U((j) => j + (performance.now() - ge)), A.current && cancelAnimationFrame(A.current)) : (r && r(), (y = v.current) == null || y.resume(), H(performance.now()), A.current = requestAnimationFrame(Se));
return;
}
if (h.current && pe)
if (_.current && cancelAnimationFrame(_.current), h.current.paused)
g && ce === 0 && g(), M && ce !== 0 && M(), h.current.addEventListener("ended", re), he(), ie(!1), (N = h.current) == null || N.play();
if (o.current && De)
if (o.current.paused)
d && oe === 0 && d(), I && oe !== 0 && I(), o.current.addEventListener("ended", re), X(!1), requestAnimationFrame(le), me();
else {
v && v(), h.current.removeEventListener("ended", re), h.current.pause(), ie(!0);
const O = h.current.currentTime;
X(O), h.current.currentTime = O;
C.current && cancelAnimationFrame(C.current), m && m(), o.current.removeEventListener("ended", re), o.current.pause(), X(!0);
const j = o.current.currentTime;
K(j), o.current.currentTime = j;
}
}, O = () => {
o.current && (je ? (d && d(), o.current.addEventListener("ended", re), X(!1), requestAnimationFrame(le), pe(!1)) : o.current.pause());
}, re = () => {
ie(!0), s && s(), h != null && h.current && (h.current.currentTime = 0, X(0));
}, Ae = () => {
var A;
if (!P)
C.current && cancelAnimationFrame(C.current), X(!0), h && h(), o != null && o.current && (o.current.currentTime = 0, K(0));
}, p = () => {
var L;
if (!k)
return;
const i = document.createElement("a");
i.href = P, i.download = `recorded_audio${at(
(A = d.current) == null ? void 0 : A.mimeType
)}`, document.body.appendChild(i), i.click(), document.body.removeChild(i), URL.revokeObjectURL(P);
i.href = k, i.download = `recorded_audio${ht(
(L = v.current) == null ? void 0 : L.mimeType
)}`, document.body.appendChild(i), i.click(), document.body.removeChild(i), URL.revokeObjectURL(k);
};
return {
isRecordingInProgress: u,
isPausedRecording: f,
audioData: p,
recordingTime: D,
isProcessingRecordedAudio: Ee,
recordedBlob: j,
mediaRecorder: d.current,
duration: R,
currentAudioTime: ce,
audioSrc: P,
isPausedRecordedAudio: ne,
bufferFromRecordedBlob: F,
isCleared: oe,
isAvailableRecordedAudio: pe,
isPreloadedBlob: se,
formattedDuration: we,
formattedRecordingTime: Se,
formattedRecordedAudioCurrentTime: ge,
audioRef: o,
isRecordingInProgress: z,
isPausedRecording: N,
audioData: R,
recordingTime: V,
isProcessingRecordedAudio: Oe,
recordedBlob: ze,
mediaRecorder: v.current,
duration: $,
currentAudioTime: oe,
audioSrc: k,
isPausedRecordedAudio: ce,
bufferFromRecordedBlob: W,
isCleared: se,
isAvailableRecordedAudio: De,
isPreloadedBlob: ae,
formattedDuration: Le,
formattedRecordingTime: $e,
formattedRecordedAudioCurrentTime: _,
isAutoplayPreloadedBlob: je,
setIsAutoplayPreloadedBlob: pe,
onClickAutoplayAudioOnLoad: _e,
setPreloadedAudioBlob: Ne,
startRecording: _e,
togglePauseResume: ye,
stopRecording: le,
saveAudioFile: Ae,
startRecording: Ee,
togglePauseResume: ve,
stopRecording: Ce,
saveAudioFile: p,
clearCanvas: te,
setCurrentAudioTime: X,
error: ze,
_setIsProcessingAudioOnComplete: x,
_setIsProcessingOnResize: be,
audioRef: h
setCurrentAudioTime: K,
error: Ie,
_setIsProcessingAudioOnComplete: b,
_setIsProcessingOnResize: He,
_handleAudioPlaybackOnLoad: O
};
}
export {
It as VoiceVisualizer,
Lt as useVoiceVisualizer
Lt as VoiceVisualizer,
wt as useVoiceVisualizer
};

@@ -7,2 +7,3 @@ import { Dispatch, MutableRefObject, SetStateAction } from "react";

export interface Controls {
audioRef: MutableRefObject<HTMLAudioElement | null>;
isRecordingInProgress: boolean;

@@ -27,2 +28,3 @@ isPausedRecording: boolean;

formattedRecordedAudioCurrentTime: string;
onClickAutoplayAudioOnLoad: () => void;
startRecording: () => void;

@@ -35,5 +37,7 @@ togglePauseResume: () => void;

error: Error | null;
isAutoplayPreloadedBlob: boolean;
setIsAutoplayPreloadedBlob: Dispatch<SetStateAction<boolean>>;
_handleAudioPlaybackOnLoad: () => void;
_setIsProcessingAudioOnComplete: Dispatch<SetStateAction<boolean>>;
_setIsProcessingOnResize: Dispatch<SetStateAction<boolean>>;
audioRef: MutableRefObject<HTMLAudioElement | null>;
}

@@ -110,2 +114,3 @@ export interface BarsData {

onResumedAudioPlayback?: () => void;
onErrorPlayingAudio?: (error: Error) => void;
}

@@ -112,0 +117,0 @@ export interface UseWebWorkerParams<T> {

{
"name": "react-voice-visualizer",
"private": false,
"version": "1.3.8",
"version": "1.5.14",
"type": "module",

@@ -6,0 +6,0 @@ "author": "Yurii Zarytskyi",

@@ -47,3 +47,3 @@ # react-voice-visualizer [Demo App](https://react-voice-visualizer.vercel.app/)

```jsx
```typescript jsx
import { useEffect } from "react";

@@ -73,7 +73,7 @@ import { useVoiceVisualizer, VoiceVisualizer } from "react-voice-visualizer";

console.log(error);
console.error(error);
}, [error]);
return (
<VoiceVisualizer controls={recorderControls} ref={audioRef}/>
<VoiceVisualizer ref={audioRef} controls={recorderControls} />
);

@@ -90,8 +90,9 @@ };

```
Example:
```jsx
```typescript jsx
import { useEffect } from 'react';
import { useVoiceVisualizer, VoiceVisualizer } from 'react-voice-visualizer';
const App = () => {
const App = ({audioBlob}) => {
const recorderControls = useVoiceVisualizer();

@@ -101,2 +102,4 @@ const {

setPreloadedAudioBlob,
togglePauseResume,
isAvailableRecordedAudio,
isPreloadedBlob,

@@ -109,3 +112,2 @@ error,

// Set the preloaded audioBlob when the component mounts
// Assuming 'audioBlob' is defined somewhere
if (audioBlob) {

@@ -120,11 +122,25 @@ setPreloadedAudioBlob(audioBlob);

console.log(error);
console.error(error);
}, [error]);
// Function to handle audio playback
const handleUserClickToPlayAudio = () => {
if (isAvailableRecordedAudio) {
togglePauseResume();
}
};
return (
<VoiceVisualizer
isControlPanelShown={false} // Set to 'false' in most cases, but should be determined based on the specific user's use case.
controls={recorderControls}
ref={audioRef}
/>
<div>
{/* Button to initiate audio playback */}
<button onClick={handleUserClickToPlayAudio}>Click to Toggle Play Audio</button>
<VoiceVisualizer
ref={audioRef}
controls={recorderControls}
isControlPanelShown={false} // Set to 'false' in most cases. You should use your own UI.
isDefaultUIShown={false} // Set to 'false' in most cases, but should be determined based on the specific user's use case.
/>
</div>
);

@@ -136,2 +152,77 @@ };

##### Autoplay Audio on Load
If you want the audio to autoplay as soon as it becomes available after a user's click, refer to the following example.
This example illustrates how to use the `onClickAutoplayAudioOnLoad` and `setPreloadedAudioBlob` functions to enable audio playback only when a user initiates it by clicking a button; otherwise, `audio.play()` may throw an error, which can be handled using the `onErrorPlayingAudio` callback passed as a parameter to the `useVoiceVisualizer` hook.
When the user clicks the "Click to Play Audio" button, audio data is fetched and prepared for playback, ensuring compliance with browser autoplay policies. This approach guarantees that audio begins playing in response to a user's interaction, providing a seamless and policy-compliant experience.
Example:
```typescript jsx
import { useEffect } from 'react';
import { useVoiceVisualizer, VoiceVisualizer } from 'react-voice-visualizer';
const App = ({audioUrl}) => {
const recorderControls = useVoiceVisualizer();
const {
// ... (Extracted controls and states, if necessary)
setPreloadedAudioBlob,
onClickAutoplayAudioOnLoad, // Import the onClickAutoplayAudioOnLoad function
togglePauseResume,
isAvailableRecordedAudio,
isPreloadedBlob,
error,
audioRef,
} = recorderControls;
// Get and log any error when it occurs
useEffect(() => {
if (!error) return;
console.error(error);
}, [error]);
// Function to handle user click event for audio autoplay
const handleUserClickToAutoplayAudio = () => {
if (isAvailableRecordedAudio) {
togglePauseResume();
} else {
// Fetch the audio data and trigger autoplay upon user interaction
onClickAutoplayAudioOnLoad(); // Call the onClickAutoplayAudioOnLoad function
fetch(audioUrl)
.then((response) => {
if (!response.ok) {
throw new Error('The network response was not successful');
}
return response.blob()
})
.then((blob) => {
setPreloadedAudioBlob(blob); // Set blob
})
.catch((err) => {
// Handle errors, both network-related and those that occur during blob retrieval
console.error(err);
});
}
};
return (
<div>
{/* Button to initiate audio playback */}
<button onClick={handleUserClickToAutoplayAudio}>Click to Play Audio</button>
<VoiceVisualizer
ref={audioRef}
controls={recorderControls}
isControlPanelShown={false} // Set to 'false'. You should use your own UI.
isDefaultUIShown={false} // Set to 'false' in most cases, but should be determined based on the specific user's use case.
/>
</div>
);
};
export default App;
```
## Getting started

@@ -161,47 +252,52 @@

| Parameter | Type | Description |
|:-------------------------|:----------------------------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| `onStartRecording` | `() => void` | Callback function triggered when recording starts. |
| `onStopRecording` | `() => void` | Callback function triggered when recording stops. |
| `onPausedRecording` | `() => void` | Callback function triggered when recording is paused. |
| `onResumedRecording` | `() => void` | Callback function triggered when recording is resumed. |
| `onClearCanvas` | `() => void` | Callback function triggered when the canvas is cleared. |
| `onEndAudioPlayback` | `() => void` | Callback function triggered when audio playback ends. |
| `onStartAudioPlayback` | `() => void` | Callback function triggered when audio playback starts. |
| `onPausedAudioPlayback` | `() => void` | Callback function triggered when audio playback is paused. |
| `onResumedAudioPlayback` | `() => void` | Callback function triggered when audio playback is resumed. |
| Parameter | Type | Description |
|:-------------------------|:-------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------|
| `onStartRecording` | `() => void` | Callback function triggered when recording starts. |
| `onStopRecording` | `() => void` | Callback function triggered when recording stops. |
| `onPausedRecording` | `() => void` | Callback function triggered when recording is paused. |
| `onResumedRecording` | `() => void` | Callback function triggered when recording is resumed. |
| `onClearCanvas` | `() => void` | Callback function triggered when the canvas is cleared. |
| `onEndAudioPlayback` | `() => void` | Callback function triggered when audio playback ends. |
| `onStartAudioPlayback` | `() => void` | Callback function triggered when audio playback starts. |
| `onPausedAudioPlayback` | `() => void` | Callback function triggered when audio playback is paused. |
| `onResumedAudioPlayback` | `() => void` | Callback function triggered when audio playback is resumed. |
| `onErrorPlayingAudio`    | `(error: Error) => void` | Callback function invoked when an error occurs during the execution of `audio.play()`. It provides an opportunity to handle and respond to such errors.    |
##### Returns
| Returns | Type | Description |
|:---------------------------------------------------------------|:----------------------------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| `audioRef` | `MutableRefObject`<br/>`<HTMLAudioElement \| null>` | Reference to the audio element used for playback. |
| `isRecordingInProgress` | `boolean` | Indicates if audio recording is currently in progress. |
| `isPausedRecording` | `boolean` | Indicates if audio recording is currently paused. |
| `audioData` | `Uint8Array` | Audio data for real-time visualization. |
| `recordingTime` | `number` | Elapsed time during recording in miliseconds. |
| `mediaRecorder` | `MediaRecorder \| null` | MediaRecorder instance used for recording audio. |
| `duration` | `number` | Duration of the recorded audio in seconds. |
| `currentAudioTime` | `number` | Current playback time of the recorded audio in seconds. |
| `audioSrc` | `string` | Source URL of the recorded audio file for playback. |
| `isPausedRecordedAudio` | `boolean` | Indicates if recorded audio playback is paused. |
| `isProcessingRecordedAudio` | `boolean` | Indicates if the recorded audio is being processed and 'Processing Audio...' text shown. |
| `isCleared` | `boolean` | Indicates if the canvas has been cleared. |
| `isPreloadedBlob` | `boolean` | Indicates whether a blob of recorded audio data has been preloaded. |
| `isAvailableRecordedAudio`                                      | `boolean`                                            | Indicates whether recorded audio is available and not currently being processed. This return value can be used to check if it's an appropriate time to work with recorded audio data in your application.                                                                                                                                                                               |
| `recordedBlob` | `Blob \| null` | Recorded audio data in Blob format. |
| `bufferFromRecordedBlob` | `AudioBuffer \| null` | Audio buffer from the recorded Blob. |
| `formattedDuration` | `string` | Formatted duration time in format 09:51m. |
| `formattedRecordingTime` | `string` | Formatted recording current time in format 09:51. |
| `formattedRecordedAudioCurrentTime` | `string` | Formatted recorded audio current time in format 09:51:1. |
| `setPreloadedAudioBlob` | `(audioBlob: Blob) => void` | This function allows you to load an existing audio blob for further processing, playback and visualization. The `audioBlob` parameter represents the recorded audio data stored in a Blob format. |
| `startRecording` | `() => void` | Function to start audio recording. |
| `togglePauseResume` | `() => void` | Function to toggle pause/resume during recording and playback of recorded audio. |
| `stopRecording` | `() => void` | Function to stop audio recording. |
| `saveAudioFile` | `() => void` | This function allows you to save the recorded audio as a `webm` file format. Please note that it supports saving audio only in the webm format. If you need to save the audio in a different format, you can use external libraries like FFmpeg to convert the Blob to your desired format. This flexibility allows you to tailor the output format according to your specific needs. |
| `clearCanvas` | `() => void` | Function to clear the visualization canvas. |
| `setCurrentAudioTime` | `Dispatch<SetStateAction<number>>` | Internal function to handle current audio time updates during playback. |
| `error` | `Error \| null` | Error object if any error occurred during recording or playback. |
| `_setIsProcessingAudioOnComplete` | `Dispatch<SetStateAction<boolean>>` | Internal function to set IsProcessingAudioOnComplete state. |
| `_setIsProcessingOnResize` | `Dispatch<SetStateAction<boolean>>` | Internal function to set IsProcessingOnResize state. |
| Returns | Type | Description |
|:----------------------------------------|:----------------------------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| `audioRef` | `MutableRefObject`<br/>`<HTMLAudioElement \| null>` | Reference to the audio element used for playback. |
| `isRecordingInProgress` | `boolean` | Indicates if audio recording is currently in progress. |
| `isPausedRecording` | `boolean` | Indicates if audio recording is currently paused. |
| `audioData` | `Uint8Array` | Audio data for real-time visualization. |
| `recordingTime` | `number` | Elapsed time during recording in milliseconds. |
| `mediaRecorder` | `MediaRecorder \| null` | MediaRecorder instance used for recording audio. |
| `duration` | `number` | Duration of the recorded audio in seconds. |
| `currentAudioTime` | `number` | Current playback time of the recorded audio in seconds. |
| `audioSrc` | `string` | Source URL of the recorded audio file for playback. |
| `isPausedRecordedAudio` | `boolean` | Indicates if recorded audio playback is paused. |
| `isProcessingRecordedAudio`              | `boolean`                                            | Indicates if the recorded audio is being processed and the 'Processing Audio...' text is shown.                                                                                                                                                                                                                                                                                          |
| `isCleared` | `boolean` | Indicates if the canvas has been cleared. |
| `isPreloadedBlob` | `boolean` | Indicates whether a blob of recorded audio data has been preloaded. |
| `isAvailableRecordedAudio`               | `boolean`                                            | Indicates whether recorded audio is available and not currently being processed. This return value can be used to check if it's an appropriate time to work with recorded audio data in your application.                                                                                                                                                                               |
| `recordedBlob` | `Blob \| null` | Recorded audio data in Blob format. |
| `bufferFromRecordedBlob` | `AudioBuffer \| null` | Audio buffer from the recorded Blob. |
| `formattedDuration` | `string` | Formatted duration time in format 09:51m. |
| `formattedRecordingTime` | `string` | Formatted recording current time in format 09:51. |
| `formattedRecordedAudioCurrentTime` | `string` | Formatted recorded audio current time in format 09:51:1. |
| `setPreloadedAudioBlob` | `(audioBlob: Blob) => void` | This function allows you to load an existing audio blob for further processing, playback and visualization. The `audioBlob` parameter represents the recorded audio data stored in a Blob format. |
| `onClickAutoplayAudioOnLoad` | `() => void` | This function should be used in conjunction with the `setPreloadedAudioBlob` function when you load a completed audio blob. To start playing audio as soon as it becomes available, call this function but only in response to a `user's click event`. It ensures compliance with autoplay policies. |
| `startRecording` | `() => void` | Function to start audio recording. |
| `togglePauseResume` | `() => void` | Function to toggle pause/resume during recording and playback of recorded audio. |
| `stopRecording` | `() => void` | Function to stop audio recording. |
| `saveAudioFile` | `() => void` | This function allows you to save the recorded audio as a `webm` file format. Please note that it supports saving audio only in the webm format. If you need to save the audio in a different format, you can use external libraries like `FFmpeg` to convert the Blob to your desired format. This flexibility allows you to tailor the output format according to your specific needs. |
| `clearCanvas` | `() => void` | Function to clear the visualization canvas. |
| `isAutoplayPreloadedBlob` | `boolean` | State that indicates whether an `onClickAutoplayAudioOnLoad` function has been called. It is set to `false` once the audio data becomes available. |
| `setIsAutoplayPreloadedBlob`             | `Dispatch<SetStateAction<boolean>>`                  | This function sets the `isAutoplayPreloadedBlob` state and can be used to abort autoplay of a preloaded blob.                                                                                                                                                                                                                                                                            |
| `setCurrentAudioTime` | `Dispatch<SetStateAction<number>>` | Internal function to handle current audio time updates during playback. |
| `error` | `Error \| null` | Error object if any error occurred during recording or playback. |
| `_setIsProcessingAudioOnComplete` | `Dispatch<SetStateAction<boolean>>` | Internal function to set `isProcessingAudioOnComplete` state. |
| `_setIsProcessingOnResize` | `Dispatch<SetStateAction<boolean>>` | Internal function to set `isProcessingOnResize` state. |
| `_handleAudioPlaybackOnLoad` | `() => void` | **(Do not use)** Internal function to handle autoplay. |
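
As an illustration of the `saveAudioFile` entry above: the bundled implementation performs a standard object-URL download of the webm Blob. A hedged sketch of the equivalent manual approach using `recordedBlob`, useful as a starting point if you convert the Blob to another format first (`downloadRecording` is a hypothetical helper name):

```typescript jsx
const { recordedBlob } = recorderControls;

const downloadRecording = () => {
  if (!recordedBlob) return;
  // Mirrors what saveAudioFile does internally: create an object URL,
  // click a temporary <a download> element, then revoke the URL.
  const url = URL.createObjectURL(recordedBlob);
  const link = document.createElement("a");
  link.href = url;
  link.download = "recorded_audio.webm";
  document.body.appendChild(link);
  link.click();
  document.body.removeChild(link);
  URL.revokeObjectURL(url);
};
```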

@@ -224,4 +320,4 @@ #### Load and visualize any Audio

|:--------------------------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:--------------|:-----------------------------|
| **`ref`** | A reference to the audio element - `audioRef` from the `useVoiceVisualizer` hook. | - | `React.RefObject` (Required) |
| **`controls`** | Provides the audio recording controls and states required for visualization. | - | `Controls` (Required) |
| **`ref`** | A reference to the audio element - `audioRef` from the `useVoiceVisualizer` hook. | - | `React.RefObject` (Required) |
| **`height`** | The height of the visualization canvas. | `200` | `string \| number` (Optional) |

@@ -228,0 +324,0 @@ | **`width`** | The width of the visualization canvas. | `100%` | `string \| number` (Optional) |
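
Putting the required props together with the sizing options (the values shown are the documented defaults, written out here only for illustration):

```typescript jsx
<VoiceVisualizer
  ref={audioRef}
  controls={recorderControls}
  height={200}  // default height, per the table above
  width="100%"  // default width, per the table above
/>
```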
