react-voice-visualizer

react-voice-visualizer - npm Package Compare versions

Comparing version to 1.7.4
dist/hooks/useLatest.d.ts
 type UseLatestReturnType<T> = {
 readonly current: T;
 };
-declare function useLatest<T>(value: T): UseLatestReturnType<T>;
-export default useLatest;
+export declare function useLatest<T>(value: T): UseLatestReturnType<T>;
+export {};
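Only the type declaration is diffed here; the hook's implementation appears later in the minified bundle as a ref kept in sync inside `useLayoutEffect`. A dependency-free sketch of those "latest ref" semantics, with React's render and ref mechanics stubbed out for illustration (the `makeUseLatest`/`render` names are hypothetical, not part of the package):

```typescript
type UseLatestReturnType<T> = { readonly current: T };

// Stand-in for React's useRef/useLayoutEffect pair, for illustration only:
// the ref object stays stable across "renders", and `current` is overwritten
// with the newest value after each one, matching the bundled hook.
function makeUseLatest<T>(initial: T) {
  const ref: { current: T } = { current: initial };
  return {
    latest: ref as UseLatestReturnType<T>,
    render(value: T) {
      ref.current = value; // what the hook's useLayoutEffect does each render
    },
  };
}

const { latest, render } = makeUseLatest("a");
render("b");
render("c");
console.log(latest.current); // "c"
```

Callbacks that close over such a ref always observe the most recent value without being re-created on every render.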
 import { Controls, useVoiceVisualizerParams } from "../types/types.ts";
-declare function useVoiceVisualizer({ onStartRecording, onStopRecording, onPausedRecording, onResumedRecording, onClearCanvas, onEndAudioPlayback, onStartAudioPlayback, onPausedAudioPlayback, onResumedAudioPlayback, }?: useVoiceVisualizerParams): Controls;
+declare function useVoiceVisualizer({ onStartRecording, onStopRecording, onPausedRecording, onResumedRecording, onClearCanvas, onEndAudioPlayback, onStartAudioPlayback, onPausedAudioPlayback, onResumedAudioPlayback, onErrorPlayingAudio, }?: useVoiceVisualizerParams): Controls;
 export default useVoiceVisualizer;
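The only API change in this declaration is the new `onErrorPlayingAudio` option. A minimal sketch of the option surface, assuming a plain callback shape: the callback names are copied from the declaration above, but the local `VoiceVisualizerCallbacks` type and the error-callback parameter type are illustrative assumptions, since the `.d.ts` diff does not show the fields of `useVoiceVisualizerParams`:

```typescript
// Hypothetical mirror of part of useVoiceVisualizerParams, for illustration.
type VoiceVisualizerCallbacks = {
  onStartRecording?: () => void;
  onStopRecording?: () => void;
  onErrorPlayingAudio?: (error: Error) => void; // added in this version
};

const callbacks: VoiceVisualizerCallbacks = {
  onStopRecording: () => console.log("recording stopped"),
  // New hook option: surfaces playback failures instead of only logging them.
  onErrorPlayingAudio: (error) => console.error("audio playback failed:", error),
};

console.log(Object.keys(callbacks));
```

In the bundle below, the new version threads this callback through the blob-decoding path, where decode failures previously had no consumer-facing hook.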
(function(){"use strict";(e=>{try{if(typeof window>"u")return;var i=document.createElement("style");i.appendChild(document.createTextNode(e)),document.head.appendChild(i)}catch(o){console.error("vite-plugin-css-injected-by-js",o)}})(".voice-visualizer__buttons-container{display:flex;justify-content:center;align-items:center;column-gap:20px;row-gap:15px;flex-wrap:wrap;margin-bottom:40px}.voice-visualizer__btn-center{box-sizing:border-box;flex-shrink:0;width:60px;height:60px;padding:0;display:flex;justify-content:center;align-items:center;border-radius:50%;background-color:#fff;border:4px solid #c5c5c5;outline:none;cursor:pointer;transition:border-color .3s,background-color .3s}.voice-visualizer__btn-center:hover{background-color:#eaeaea}.voice-visualizer__btn-center>img{width:auto;height:50%;max-height:30px}.voice-visualizer__btn-center.voice-visualizer__btn-center-pause{background-color:#ff3030}.voice-visualizer__btn-center.voice-visualizer__btn-center-pause:hover{background-color:#ff4f4f}.voice-visualizer__btn-center.voice-visualizer__btn-center-pause>img{height:50%;max-height:16px}.voice-visualizer__btn-center:hover{border:4px solid #9f9f9f}.voice-visualizer__btn-left{box-sizing:border-box;flex-shrink:0;width:60px;height:60px;padding:0;display:flex;justify-content:center;align-items:center;border-radius:50%;background-color:#ff3030;border:4px solid #c5c5c5;outline:none;cursor:pointer;transition:border-color .3s,background-color .3s,opacity .3s}.voice-visualizer__btn-left:hover{background-color:#ff4f4f}.voice-visualizer__btn-left:disabled{opacity:.6;background-color:#ff3030}.voice-visualizer__btn-left.voice-visualizer__btn-left-microphone{background-color:#fff}.voice-visualizer__btn-left.voice-visualizer__btn-left-microphone>img{width:auto;height:50%;max-height:30px}.voice-visualizer__btn-left>img{width:auto;height:50%;max-height:16px}.voice-visualizer__btn-left:hover{border:4px solid 
#9f9f9f}.voice-visualizer__btn{box-sizing:border-box;min-width:100px;min-height:60px;padding:5px 20px;border-radius:40px;font-size:15px;background-color:#f0f0f0;transition:background-color .3s,opacity .3s}.voice-visualizer__btn:disabled{opacity:.8;background-color:#f0f0f0}.voice-visualizer__btn:hover{background-color:#bebebe}.voice-visualizer__canvas-container{position:relative;width:fit-content;margin:0 auto;overflow:hidden}.voice-visualizer__canvas-container canvas{display:block}.voice-visualizer__canvas-microphone-btn{position:absolute;top:50%;left:50%;width:auto;max-width:12%;min-width:24px;height:50%;max-height:100px;background-color:transparent;border:none;outline:none;transform:translate(-50%,-50%)}.voice-visualizer__canvas-microphone-icon{width:100%;height:100%;will-change:transform;transition:transform .3s}.voice-visualizer__canvas-microphone-btn:hover .voice-visualizer__canvas-microphone-icon{transform:scale(1.03)}.voice-visualizer__canvas-audio-wave-icon{position:absolute;top:50%;left:50%;width:auto;max-width:40%;height:40%;max-height:100px;transform:translate(-118%,-50%) scale(-1)}.voice-visualizer__canvas-audio-wave-icon2{transform:translate(18%,-50%)}.voice-visualizer__canvas-audio-processing{position:absolute;top:50%;left:50%;margin:0;transform:translate(-50%,-50%)}.voice-visualizer__progress-indicator-hovered{position:absolute;top:0;pointer-events:none;height:100%;width:1px;background-color:#85858599}.voice-visualizer__progress-indicator-hovered-time{position:absolute;top:3%;left:1px;width:fit-content;margin:0;padding:0 7px;opacity:.8;font-size:12px;border-radius:0 4px 4px 0;background-color:#575757;text-align:left}.voice-visualizer__progress-indicator-hovered-time.voice-visualizer__progress-indicator-hovered-time-left{left:unset;right:1px;border-radius:4px 0 0 
4px}.voice-visualizer__progress-indicator{position:absolute;top:0;pointer-events:none;height:100%;width:1px;background-color:#efefef}.voice-visualizer__progress-indicator-time{position:absolute;top:3%;left:1px;width:fit-content;box-sizing:border-box;min-width:37px;margin:0;padding:0 7px;font-size:12px;border-radius:0 4px 4px 0;text-align:left;color:#000;font-weight:500;background-color:#efefef}.voice-visualizer__progress-indicator-time.voice-visualizer__progress-indicator-time-left{left:unset;right:1px;border-radius:4px 0 0 4px}.voice-visualizer__audio-info-container{box-sizing:border-box;height:55px;display:flex;align-items:center;justify-content:center;gap:30px}.voice-visualizer__audio-info-time{margin:15px 0;min-width:38px;text-align:left}.voice-visualizer__visually-hidden{position:absolute;width:1px;height:1px;margin:-1px;padding:0;border:4px solid #c5c5c5;white-space:nowrap;clip-path:inset(100%);clip:rect(0 0 0 0);overflow:hidden}")})();
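The bundle opens with a style-injection IIFE generated by vite-plugin-css-injected-by-js: at load time it appends the component stylesheet to `<head>` so no separate CSS import is needed. A readable, de-minified sketch of that pattern (the `injectCss` name is illustrative):

```typescript
// De-minified form of the IIFE at the top of the bundle: create a <style>
// element holding the CSS text and append it to document.head.
function injectCss(css: string): void {
  try {
    // Skip during server-side rendering, where `window` does not exist.
    if (typeof window === "undefined") return;
    const style = document.createElement("style");
    style.appendChild(document.createTextNode(css));
    document.head.appendChild(style);
  } catch (err) {
    console.error("vite-plugin-css-injected-by-js", err);
  }
}

injectCss(".voice-visualizer__btn { font-size: 15px; }");
```

The try/catch mirrors the original: a failure to inject styles is logged but never breaks module evaluation.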
import { jsx as a, jsxs as de, Fragment as Ue } from "react/jsx-runtime";
import { useState as l, useRef as y, useCallback as tt, useLayoutEffect as We, forwardRef as rt, useEffect as Z } from "react";
const He = ({
import { jsx as a, jsxs as ae, Fragment as Fe } from "react/jsx-runtime";
import { useState as l, useRef as N, useCallback as rt, useLayoutEffect as Be, forwardRef as nt, useEffect as Z } from "react";
const be = ({
canvas: e,
backgroundColor: t
}) => {
const n = e.height, r = e.width, c = Math.round(r / 2), s = e.getContext("2d");
return s ? (s.clearRect(0, 0, r, n), t !== "transparent" && (s.fillStyle = t, s.fillRect(0, 0, r, n)), { context: s, height: n, width: r, halfWidth: c }) : null;
}, De = ({
const n = e.height, r = e.width, c = Math.round(r / 2), u = e.getContext("2d");
return u ? (u.clearRect(0, 0, r, n), t !== "transparent" && (u.fillStyle = t, u.fillRect(0, 0, r, n)), { context: u, height: n, width: r, halfWidth: c }) : null;
}, je = ({
context: e,

@@ -16,7 +16,7 @@ color: t,

y: c,
w: s,
h: g
w: u,
h: M
}) => {
e.fillStyle = t, e.beginPath(), e.roundRect ? (e.roundRect(r, c, s, g, n), e.fill()) : e.fillRect(r, c, s, g);
}, nt = ({
e.fillStyle = t, e.beginPath(), e.roundRect ? (e.roundRect(r, c, u, M, n), e.fill()) : e.fillRect(r, c, u, M);
}, it = ({
barsData: e,

@@ -27,20 +27,20 @@ canvas: t,

backgroundColor: c,
mainBarColor: s,
secondaryBarColor: g,
currentAudioTime: v = 0,
rounded: M,
duration: u
mainBarColor: u,
secondaryBarColor: M,
currentAudioTime: m = 0,
rounded: I,
duration: v
}) => {
const I = He({ canvas: t, backgroundColor: c });
if (!I)
const d = be({ canvas: t, backgroundColor: c });
if (!d)
return;
const { context: f, height: w } = I, L = v / u;
e.forEach((o, p) => {
const H = p / e.length, m = L > H;
De({
context: f,
color: m ? g : s,
rounded: M,
x: p * (n + r * n),
y: w / 2 - o.max,
const { context: z, height: A } = d, S = m / v;
e.forEach((o, g) => {
const b = g / e.length, h = S > b;
je({
context: z,
color: h ? M : u,
rounded: I,
x: g * (n + r * n),
y: A / 2 - o.max,
h: o.max * 2,

@@ -51,3 +51,3 @@ w: n

};
function it({
function ct({
context: e,

@@ -58,15 +58,15 @@ color: t,

height: c,
barWidth: s
barWidth: u
}) {
De({
je({
context: e,
color: t,
rounded: n,
x: r / 2 + s / 2,
x: r / 2 + u / 2,
y: c / 2 - 1,
h: 2,
w: r - (r / 2 + s / 2)
w: r - (r / 2 + u / 2)
});
}
const ct = ({
const ot = ({
audioData: e,

@@ -77,61 +77,61 @@ unit: t,

canvas: c,
isRecordingInProgress: s,
isPausedRecording: g,
picks: v,
backgroundColor: M,
barWidth: u,
mainBarColor: I,
secondaryBarColor: f,
rounded: w,
animateCurrentPick: L,
isRecordingInProgress: u,
isPausedRecording: M,
picks: m,
backgroundColor: I,
barWidth: v,
mainBarColor: d,
secondaryBarColor: z,
rounded: A,
animateCurrentPick: S,
fullscreen: o
}) => {
const p = He({ canvas: c, backgroundColor: M });
if (!p)
const g = be({ canvas: c, backgroundColor: I });
if (!g)
return;
const { context: H, height: m, width: x, halfWidth: j } = p;
if (e != null && e.length && s) {
const F = Math.max(...e);
if (!g) {
if (r.current >= u) {
const { context: b, height: h, width: H, halfWidth: _ } = g;
if (e != null && e.length && u) {
const $ = Math.max(...e);
if (!M) {
if (r.current >= v) {
r.current = 0;
const D = (m - F / 258 * m) / m * 100, U = (-m + F / 258 * m * 2) / m * 100, V = n.current === u ? {
startY: D,
barHeight: U
const j = (h - $ / 258 * h) / h * 100, ee = (-h + $ / 258 * h * 2) / h * 100, F = n.current === v ? {
startY: j,
barHeight: ee
} : null;
n.current >= t ? n.current = u : n.current += u, v.length > (o ? x : j) / u && v.pop(), v.unshift(V);
n.current >= t ? n.current = v : n.current += v, m.length > (o ? H : _) / v && m.pop(), m.unshift(F);
}
r.current += 1;
}
!o && Q(), L && De({
context: H,
rounded: w,
color: I,
x: o ? x : j,
y: m - F / 258 * m,
h: -m + F / 258 * m * 2,
w: u
!o && ue(), S && je({
context: b,
rounded: A,
color: d,
x: o ? H : _,
y: h - $ / 258 * h,
h: -h + $ / 258 * h * 2,
w: v
});
let B = (o ? x : j) - r.current;
v.forEach((D) => {
D && De({
context: H,
color: I,
rounded: w,
x: B,
y: D.startY * m / 100 > m / 2 - 1 ? m / 2 - 1 : D.startY * m / 100,
h: D.barHeight * m / 100 > 2 ? D.barHeight * m / 100 : 2,
w: u
}), B -= u;
let U = (o ? H : _) - r.current;
m.forEach((j) => {
j && je({
context: b,
color: d,
rounded: A,
x: U,
y: j.startY * h / 100 > h / 2 - 1 ? h / 2 - 1 : j.startY * h / 100,
h: j.barHeight * h / 100 > 2 ? j.barHeight * h / 100 : 2,
w: v
}), U -= v;
});
} else
v.length = 0;
function Q() {
it({
context: H,
color: f,
rounded: w,
width: x,
height: m,
barWidth: u
m.length = 0;
function ue() {
ct({
context: b,
color: z,
rounded: A,
width: H,
height: h,
barWidth: v
});

@@ -153,3 +153,3 @@ }

).charAt(0)}`;
}, ot = (e) => {
}, st = (e) => {
const t = Math.floor(e / 1e3), n = Math.floor(t / 3600), r = Math.floor(t % 3600 / 60), c = t % 60;

@@ -169,3 +169,3 @@ return n > 0 ? `${String(n).padStart(2, "0")}:${String(r).padStart(

}
const st = ({
const at = ({
bufferData: e,

@@ -177,22 +177,22 @@ height: t,

}) => {
const s = n / (r + c * r), g = Math.floor(e.length / s), v = t / 2;
let M = [], u = 0;
for (let I = 0; I < s; I++) {
const f = [];
let w = 0;
for (let o = 0; o < g && I * g + o < e.length; o++) {
const p = e[I * g + o];
p > 0 && (f.push(p), w++);
const u = n / (r + c * r), M = Math.floor(e.length / u), m = t / 2;
let I = [], v = 0;
for (let d = 0; d < u; d++) {
const z = [];
let A = 0;
for (let o = 0; o < M && d * M + o < e.length; o++) {
const g = e[d * M + o];
g > 0 && (z.push(g), A++);
}
const L = f.reduce((o, p) => o + p, 0) / w;
L > u && (u = L), M.push({ max: L });
const S = z.reduce((o, g) => o + g, 0) / A;
S > v && (v = S), I.push({ max: S });
}
if (v * 0.95 > u * v) {
const I = v * 0.95 / u;
M = M.map((f) => ({
max: f.max > 0.01 ? f.max * I : 1
if (m * 0.95 > v * m) {
const d = m * 0.95 / v;
I = I.map((z) => ({
max: z.max > 0.01 ? z.max * d : 1
}));
}
return M;
}, at = (e) => {
return I;
}, ut = (e) => {
if (!e)

@@ -202,3 +202,3 @@ return "";

return t && t.length >= 2 ? `.${t[1]}` : "";
}, ut = (e) => {
}, ht = (e) => {
const t = Math.floor(e / 3600), n = Math.floor(e % 3600 / 60), r = e % 60, c = Math.floor(

@@ -215,3 +215,3 @@ (r - Math.floor(r)) * 1e3

).charAt(0)}${String(c).charAt(1)}s`;
}, ht = (e) => {
}, lt = (e) => {
onmessage = (t) => {

@@ -221,3 +221,3 @@ postMessage(e(t.data));

};
function lt({
function mt({
fn: e,

@@ -231,18 +231,18 @@ initialValue: t,

setResult: c,
run: (g) => {
const v = new Worker(
run: (M) => {
const m = new Worker(
// eslint-disable-next-line @typescript-eslint/restrict-template-expressions
URL.createObjectURL(new Blob([`(${ht})(${e})`]))
URL.createObjectURL(new Blob([`(${lt})(${e})`]))
);
v.onmessage = (M) => {
M.data && (c(M.data), n && n(), v.terminate());
}, v.onerror = (M) => {
console.error(M.message), v.terminate();
}, v.postMessage(g);
m.onmessage = (I) => {
I.data && (c(I.data), n && n(), m.terminate());
}, m.onerror = (I) => {
console.error(I.message), m.terminate();
}, m.postMessage(M);
}
};
}
const mt = (e, t = 250) => {
const n = y();
return tt(
const vt = (e, t = 250) => {
const n = N();
return rt(
// eslint-disable-next-line @typescript-eslint/no-explicit-any

@@ -258,3 +258,9 @@ (...r) => {

};
const vt = ({
function dt(e) {
const t = N(e);
return Be(() => {
t.current = e;
}, [e]), t;
}
const ft = ({
color: e = "#000000",

@@ -298,10 +304,3 @@ stroke: t = 2,

}
), Be = "data:image/svg+xml;base64,PHN2ZyB3aWR0aD0iMjMiIGhlaWdodD0iMzMiIGZpbGw9Im5vbmUiIHhtbG5zPSJodHRwOi8vd3d3LnczLm9yZy8yMDAwL3N2ZyI+CiAgPHBhdGggZD0iTTEuMSAxNi43MmMwIDMgLjk2IDUuOCAzLjYxIDcuOTVhOS45NiA5Ljk2IDAgMCAwIDYuNSAyLjE3bTAgMHY0LjM0aDQuMzQtOC42N200LjM0LTQuMzRjMi4zNSAwIDQuNDItLjQ4IDYuNS0yLjE3YTkuODcgOS44NyAwIDAgMCAzLjYxLTcuOTVNMTEuMjIgMS44MmMtMS40NSAwLTIuNS4zNy0zLjMuOTNhNS42IDUuNiAwIDAgMC0xLjg0IDIuNGMtLjkgMi4wNi0xLjEgNC43Ny0xLjEgNy4yNCAwIDIuNDYuMiA1LjE3IDEuMSA3LjI0YTUuNiA1LjYgMCAwIDAgMS44NCAyLjRjLjguNTUgMS44NS45MiAzLjMuOTIgMS40NCAwIDIuNS0uMzcgMy4yOS0uOTNhNS42IDUuNiAwIDAgMCAxLjg0LTIuNGMuOS0yLjA2IDEuMS00Ljc3IDEuMS03LjIzIDAtMi40Ny0uMi01LjE4LTEuMS03LjI0YTUuNiA1LjYgMCAwIDAtMS44NC0yLjQgNS41MiA1LjUyIDAgMCAwLTMuMy0uOTNaIiBzdHJva2U9IiMwMDAiIHN0cm9rZS1saW5lY2FwPSJyb3VuZCIgc3Ryb2tlLWxpbmVqb2luPSJyb3VuZCIvPgo8L3N2Zz4K", dt = "data:image/svg+xml;base64,PHN2ZyB3aWR0aD0iMjYiIGhlaWdodD0iMjQiIGZpbGw9Im5vbmUiIHhtbG5zPSJodHRwOi8vd3d3LnczLm9yZy8yMDAwL3N2ZyI+CiAgPHBhdGggZD0iTTE4Ljc1IDYuMTZjNC4zMSAyLjYgNi40NiAzLjkgNi40NiA1Ljg0IDAgMS45NS0yLjE1IDMuMjQtNi40NiA1Ljg0bC00Ljg0IDIuOTJjLTQuMzEgMi42LTYuNDYgMy44OS04LjA4IDIuOTItMS42Mi0uOTgtMS42Mi0zLjU3LTEuNjItOC43NlY5LjA4YzAtNS4xOSAwLTcuNzggMS42Mi04Ljc2IDEuNjItLjk3IDMuNzcuMzMgOC4wOCAyLjkybDQuODQgMi45MloiIGZpbGw9IiNmZmYiLz4KPC9zdmc+Cg==", Pe = "data:image/svg+xml;base64,PHN2ZyB3aWR0aD0iMjEiIGhlaWdodD0iMjkiIGZpbGw9Im5vbmUiIHhtbG5zPSJodHRwOi8vd3d3LnczLm9yZy8yMDAwL3N2ZyI+CiAgPHBhdGggZD0iTTE0IDMuNWEzLjUgMy41IDAgMSAxIDcgMHYyMmEzLjUgMy41IDAgMSAxLTcgMHYtMjJaIiBmaWxsPSIjZmZmIi8+CiAgPHJlY3Qgd2lkdGg9IjciIGhlaWdodD0iMjkiIHJ4PSIzLjUiIGZpbGw9IiNmZmYiLz4KPC9zdmc+Cg==", ft = "data:image/svg+xml;base64,PHN2ZyB3aWR0aD0iMjciIGhlaWdodD0iMjUiIGZpbGw9Im5vbmUiIHhtbG5zPSJodHRwOi8vd3d3LnczLm9yZy8yMDAwL3N2ZyI+CiAgPHJlY3QgeD0iLjIxIiB3aWR0aD0iMjYiIGhlaWdodD0iMjUiIHJ4PSI1IiBmaWxsPSIjZmZmIi8+Cjwvc3ZnPgo=";
function zt(e) {
const t = y(e);
return We(() => {
t.current = e;
}, [e]), t;
}
const It = rt(
), Pe = "data:image/svg+xml;base64,PHN2ZyB3aWR0aD0iMjMiIGhlaWdodD0iMzMiIGZpbGw9Im5vbmUiIHhtbG5zPSJodHRwOi8vd3d3LnczLm9yZy8yMDAwL3N2ZyI+CiAgPHBhdGggZD0iTTEuMSAxNi43MmMwIDMgLjk2IDUuOCAzLjYxIDcuOTVhOS45NiA5Ljk2IDAgMCAwIDYuNSAyLjE3bTAgMHY0LjM0aDQuMzQtOC42N200LjM0LTQuMzRjMi4zNSAwIDQuNDItLjQ4IDYuNS0yLjE3YTkuODcgOS44NyAwIDAgMCAzLjYxLTcuOTVNMTEuMjIgMS44MmMtMS40NSAwLTIuNS4zNy0zLjMuOTNhNS42IDUuNiAwIDAgMC0xLjg0IDIuNGMtLjkgMi4wNi0xLjEgNC43Ny0xLjEgNy4yNCAwIDIuNDYuMiA1LjE3IDEuMSA3LjI0YTUuNiA1LjYgMCAwIDAgMS44NCAyLjRjLjguNTUgMS44NS45MiAzLjMuOTIgMS40NCAwIDIuNS0uMzcgMy4yOS0uOTNhNS42IDUuNiAwIDAgMCAxLjg0LTIuNGMuOS0yLjA2IDEuMS00Ljc3IDEuMS03LjIzIDAtMi40Ny0uMi01LjE4LTEuMS03LjI0YTUuNiA1LjYgMCAwIDAtMS44NC0yLjQgNS41MiA1LjUyIDAgMCAwLTMuMy0uOTNaIiBzdHJva2U9IiMwMDAiIHN0cm9rZS1saW5lY2FwPSJyb3VuZCIgc3Ryb2tlLWxpbmVqb2luPSJyb3VuZCIvPgo8L3N2Zz4K", zt = "data:image/svg+xml;base64,PHN2ZyB3aWR0aD0iMjYiIGhlaWdodD0iMjQiIGZpbGw9Im5vbmUiIHhtbG5zPSJodHRwOi8vd3d3LnczLm9yZy8yMDAwL3N2ZyI+CiAgPHBhdGggZD0iTTE4Ljc1IDYuMTZjNC4zMSAyLjYgNi40NiAzLjkgNi40NiA1Ljg0IDAgMS45NS0yLjE1IDMuMjQtNi40NiA1Ljg0bC00Ljg0IDIuOTJjLTQuMzEgMi42LTYuNDYgMy44OS04LjA4IDIuOTItMS42Mi0uOTgtMS42Mi0zLjU3LTEuNjItOC43NlY5LjA4YzAtNS4xOSAwLTcuNzggMS42Mi04Ljc2IDEuNjItLjk3IDMuNzcuMzMgOC4wOCAyLjkybDQuODQgMi45MloiIGZpbGw9IiNmZmYiLz4KPC9zdmc+Cg==", We = "data:image/svg+xml;base64,PHN2ZyB3aWR0aD0iMjEiIGhlaWdodD0iMjkiIGZpbGw9Im5vbmUiIHhtbG5zPSJodHRwOi8vd3d3LnczLm9yZy8yMDAwL3N2ZyI+CiAgPHBhdGggZD0iTTE0IDMuNWEzLjUgMy41IDAgMSAxIDcgMHYyMmEzLjUgMy41IDAgMSAxLTcgMHYtMjJaIiBmaWxsPSIjZmZmIi8+CiAgPHJlY3Qgd2lkdGg9IjciIGhlaWdodD0iMjkiIHJ4PSIzLjUiIGZpbGw9IiNmZmYiLz4KPC9zdmc+Cg==", gt = "data:image/svg+xml;base64,PHN2ZyB3aWR0aD0iMjciIGhlaWdodD0iMjUiIGZpbGw9Im5vbmUiIHhtbG5zPSJodHRwOi8vd3d3LnczLm9yZy8yMDAwL3N2ZyI+CiAgPHJlY3QgeD0iLjIxIiB3aWR0aD0iMjYiIGhlaWdodD0iMjUiIHJ4PSI1IiBmaWxsPSIjZmZmIi8+Cjwvc3ZnPgo=", pt = nt(
({

@@ -313,110 +312,110 @@ controls: {

duration: r,
audioSrc: c,
currentAudioTime: s,
bufferFromRecordedBlob: g,
togglePauseResume: v,
startRecording: M,
stopRecording: u,
saveAudioFile: I,
isAvailableRecordedAudio: f,
isPausedRecordedAudio: w,
isPausedRecording: L,
currentAudioTime: c,
audioSrc: u,
bufferFromRecordedBlob: M,
togglePauseResume: m,
startRecording: I,
stopRecording: v,
saveAudioFile: d,
isAvailableRecordedAudio: z,
isPausedRecordedAudio: A,
isPausedRecording: S,
isProcessingRecordedAudio: o,
isCleared: p,
formattedDuration: H,
formattedRecordingTime: m,
formattedRecordedAudioCurrentTime: x,
clearCanvas: j,
setCurrentAudioTime: Q,
_setIsProcessingAudioOnComplete: F,
_setIsProcessingOnResize: B
isCleared: g,
formattedDuration: b,
formattedRecordingTime: h,
formattedRecordedAudioCurrentTime: H,
clearCanvas: _,
setCurrentAudioTime: ue,
_setIsProcessingAudioOnComplete: $,
_setIsProcessingOnResize: U
},
width: D = "100%",
height: U = 200,
speed: V = 3,
backgroundColor: $ = "transparent",
mainBarColor: R = "#FFFFFF",
secondaryBarColor: q = "#5e5e5e",
barWidth: P = 2,
gap: fe = 1,
rounded: ne = 5,
isControlPanelShown: ie = !0,
isDownloadAudioButtonShown: ce = !1,
animateCurrentPick: X = !0,
fullscreen: oe = !1,
onlyRecording: W = !1,
isDefaultUIShown: se = !0,
defaultMicrophoneIconColor: Te = R,
defaultAudioWaveIconColor: Le = R,
mainContainerClassName: be,
canvasContainerClassName: ze,
isProgressIndicatorShown: Y = !W,
progressIndicatorClassName: d,
isProgressIndicatorTimeShown: C = !0,
progressIndicatorTimeClassName: K,
isProgressIndicatorOnHoverShown: ae = !W,
progressIndicatorOnHoverClassName: G,
isProgressIndicatorTimeOnHoverShown: T = !0,
progressIndicatorTimeOnHoverClassName: _,
isAudioProcessingTextShown: h = !0,
audioProcessingTextClassName: pe,
controlButtonsClassName: we
}, Se) => {
const [ge, Ee] = l(0), [b, Ce] = l(0), [ee, ue] = l(0), [he, _e] = l(0), [le, te] = l(!1), [Ne, ye] = l(window.innerWidth), [re, Ae] = l(!1), i = Ne < 768, A = Math.trunc(V), S = Math.trunc(fe), N = Math.trunc(
i && S > 0 ? P + 1 : P
), O = N + S * N, z = y(null), je = y([]), me = y(A), ve = y(N), Je = y(N), Me = y(null), Qe = zt(Ne), {
result: Ie,
setResult: Ve,
run: qe
} = lt({
fn: st,
width: j = "100%",
height: ee = 200,
speed: F = 3,
backgroundColor: O = "transparent",
mainBarColor: D = "#FFFFFF",
secondaryBarColor: B = "#5e5e5e",
barWidth: te = 2,
gap: k = 1,
rounded: J = 5,
isControlPanelShown: ye = !0,
isDownloadAudioButtonShown: re = !1,
animateCurrentPick: ne = !0,
fullscreen: Y = !1,
onlyRecording: G = !1,
isDefaultUIShown: he = !0,
defaultMicrophoneIconColor: De = D,
defaultAudioWaveIconColor: ge = D,
mainContainerClassName: le,
canvasContainerClassName: Q,
isProgressIndicatorShown: p = !G,
progressIndicatorClassName: R,
isProgressIndicatorTimeShown: V = !0,
progressIndicatorTimeClassName: ie,
isProgressIndicatorOnHoverShown: q = !G,
progressIndicatorOnHoverClassName: C,
isProgressIndicatorTimeOnHoverShown: x = !0,
progressIndicatorTimeOnHoverClassName: s,
isAudioProcessingTextShown: Me = !0,
audioProcessingTextClassName: Te,
controlButtonsClassName: Ie
}, Ee) => {
const [me, pe] = l(0), [T, Ce] = l(0), [X, ve] = l(0), [ce, _e] = l(0), [oe, K] = l(!1), [we, Le] = l(window.innerWidth), [se, Se] = l(!1), i = we < 768, f = Math.trunc(F), E = Math.trunc(k), w = Math.trunc(
i && E > 0 ? te + 1 : te
), Ne = w + E * w, L = N(null), He = N([]), Ae = N(f), Je = N(w), Qe = N(w), de = N(null), P = Ee, Ve = dt(we), {
result: fe,
setResult: qe,
run: Xe
} = mt({
fn: at,
initialValue: [],
onMessageReceived: Ke
}), Xe = mt(xe);
onMessageReceived: et
}), Ke = vt(xe);
Z(() => {
xe();
const E = () => {
Qe.current !== window.innerWidth && (f ? (ye(window.innerWidth), B(!0), Ae(!0), Xe()) : (ye(window.innerWidth), xe()));
const y = () => {
Ve.current !== window.innerWidth && (z ? (Le(window.innerWidth), U(!0), Se(!0), Ke()) : (Le(window.innerWidth), xe()));
};
return window.addEventListener("resize", E), () => {
window.removeEventListener("resize", E);
return window.addEventListener("resize", y), () => {
window.removeEventListener("resize", y);
};
}, [D, f]), We(() => {
z.current && ((me.current >= A || !e.length) && (me.current = 0, ct({
}, [j, z]), Be(() => {
L.current && ((Ae.current >= f || !e.length) && (Ae.current = 0, ot({
audioData: e,
unit: O,
index: ve,
index2: Je,
canvas: z.current,
picks: je.current,
unit: Ne,
index: Je,
index2: Qe,
canvas: L.current,
picks: He.current,
isRecordingInProgress: t,
isPausedRecording: L,
backgroundColor: $,
mainBarColor: R,
secondaryBarColor: q,
barWidth: N,
rounded: ne,
animateCurrentPick: X,
fullscreen: oe
})), me.current += 1);
isPausedRecording: S,
backgroundColor: O,
mainBarColor: D,
secondaryBarColor: B,
barWidth: w,
rounded: J,
animateCurrentPick: ne,
fullscreen: Y
})), Ae.current += 1);
}, [
z.current,
L.current,
e,
N,
$,
R,
q,
ne,
oe,
se,
he
w,
O,
D,
B,
J,
Y,
he,
ce
]), Z(() => {
var E, k;
if (f)
return le ? (E = z.current) == null || E.addEventListener("mouseleave", Re) : (k = z.current) == null || k.addEventListener("mouseenter", $e), () => {
var J, Fe;
le ? (J = z.current) == null || J.removeEventListener(
var y, W;
if (z)
return oe ? (y = L.current) == null || y.addEventListener("mouseleave", Oe) : (W = L.current) == null || W.addEventListener("mouseenter", $e), () => {
var ze, Ue;
oe ? (ze = L.current) == null || ze.removeEventListener(
"mouseleave",
Re
) : (Fe = z.current) == null || Fe.removeEventListener(
Oe
) : (Ue = L.current) == null || Ue.removeEventListener(
"mouseenter",

@@ -426,51 +425,51 @@ $e

};
}, [le, f]), Z(() => {
var k;
if (!g || !z.current || t || re)
}, [oe, z]), Z(() => {
var W;
if (!M || !L.current || t || se)
return;
if (W) {
j();
if (G) {
_();
return;
}
je.current = [];
const E = g.getChannelData(0);
return qe({
bufferData: E,
height: ee,
width: he,
barWidth: N,
gap: S
}), (k = z.current) == null || k.addEventListener(
He.current = [];
const y = M.getChannelData(0);
return Xe({
bufferData: y,
height: X,
width: ce,
barWidth: w,
gap: E
}), (W = L.current) == null || W.addEventListener(
"mousemove",
Oe
Re
), () => {
var J;
(J = z.current) == null || J.removeEventListener(
var ze;
(ze = L.current) == null || ze.removeEventListener(
"mousemove",
Oe
Re
);
};
}, [
g,
b,
ee,
fe,
P,
re
M,
T,
X,
k,
te,
se
]), Z(() => {
if (!(W || !(Ie != null && Ie.length) || !z.current || o)) {
if (p) {
Ve([]);
if (!(G || !(fe != null && fe.length) || !L.current || o)) {
if (g) {
qe([]);
return;
}
nt({
barsData: Ie,
canvas: z.current,
barWidth: N,
gap: S,
backgroundColor: $,
mainBarColor: R,
secondaryBarColor: q,
currentAudioTime: s,
rounded: ne,
it({
barsData: fe,
canvas: L.current,
barWidth: w,
gap: E,
backgroundColor: O,
mainBarColor: D,
secondaryBarColor: B,
currentAudioTime: c,
rounded: J,
duration: r

@@ -480,51 +479,50 @@ });

}, [
Ie,
s,
p,
ne,
$,
R,
q
fe,
c,
g,
J,
O,
D,
B
]), Z(() => {
o && z.current && He({
canvas: z.current,
backgroundColor: $
o && L.current && be({
canvas: L.current,
backgroundColor: O
});
}, [o]);
function xe() {
if (!Me.current || !z.current)
if (!de.current || !L.current)
return;
me.current = A;
const E = Math.trunc(
Me.current.clientHeight * window.devicePixelRatio / 2
Ae.current = f;
const y = Math.trunc(
de.current.clientHeight * window.devicePixelRatio / 2
) * 2;
Ce(Me.current.clientWidth), ue(E), _e(
Ce(de.current.clientWidth), ve(y), _e(
Math.round(
Me.current.clientWidth * window.devicePixelRatio
de.current.clientWidth * window.devicePixelRatio
)
), Ae(!1);
), Se(!1);
}
function Ke() {
B(!1), F(!1);
function et() {
U(!1), $(!1), P != null && P.current && (P.current.src = u);
}
const $e = () => {
te(!0);
}, Re = () => {
te(!1);
}, Oe = (E) => {
Ee(E.offsetX);
}, et = (E) => {
const k = Se;
if (k.current && z.current) {
const J = r / b * (E.clientX - z.current.getBoundingClientRect().left);
k.current.currentTime = J, Q(J);
K(!0);
}, Oe = () => {
K(!1);
}, Re = (y) => {
pe(y.offsetX);
}, tt = (y) => {
if (P != null && P.current && L.current) {
const W = r / T * (y.clientX - L.current.getBoundingClientRect().left);
P.current.currentTime = W, ue(W);
}
}, Ze = s / r * b;
return /* @__PURE__ */ de("div", { className: `voice-visualizer ${be ?? ""}`, children: [
/* @__PURE__ */ de(
}, Ze = c / r * T;
return /* @__PURE__ */ ae("div", { className: `voice-visualizer ${le ?? ""}`, children: [
/* @__PURE__ */ ae(
"div",
{
className: `voice-visualizer__canvas-container ${ze ?? ""}`,
ref: Me,
style: { width: Ye(D) },
className: `voice-visualizer__canvas-container ${Q ?? ""}`,
ref: de,
style: { width: Ye(j) },
children: [

@@ -534,9 +532,9 @@ /* @__PURE__ */ a(

{
ref: z,
width: he,
height: ee,
onClick: et,
ref: L,
width: ce,
height: X,
onClick: tt,
style: {
height: Ye(U),
width: b
height: Ye(ee),
width: T
},

@@ -546,14 +544,14 @@ children: "Your browser does not support HTML5 Canvas."

),
se && p && /* @__PURE__ */ de(Ue, { children: [
/* @__PURE__ */ a(Ge, { color: Le }),
/* @__PURE__ */ a(Ge, { color: Le, reflect: !0 }),
he && g && /* @__PURE__ */ ae(Fe, { children: [
/* @__PURE__ */ a(Ge, { color: ge }),
/* @__PURE__ */ a(Ge, { color: ge, reflect: !0 }),
/* @__PURE__ */ a(
"button",
{
onClick: M,
onClick: I,
className: "voice-visualizer__canvas-microphone-btn",
children: /* @__PURE__ */ a(
vt,
ft,
{
color: Te,
color: De,
stroke: 0.5,

@@ -566,25 +564,25 @@ className: "voice-visualizer__canvas-microphone-icon"

] }),
h && o && /* @__PURE__ */ a(
Me && o && /* @__PURE__ */ a(
"p",
{
className: `voice-visualizer__canvas-audio-processing ${pe ?? ""}`,
style: { color: R },
className: `voice-visualizer__canvas-audio-processing ${Te ?? ""}`,
style: { color: D },
children: "Processing Audio..."
}
),
le && f && !o && !i && ae && /* @__PURE__ */ a(
oe && z && !o && !i && q && /* @__PURE__ */ a(
"div",
{
className: `voice-visualizer__progress-indicator-hovered ${G ?? ""}`,
className: `voice-visualizer__progress-indicator-hovered ${C ?? ""}`,
style: {
left: ge
left: me
},
children: T && /* @__PURE__ */ a(
children: x && /* @__PURE__ */ a(
"p",
{
className: `voice-visualizer__progress-indicator-hovered-time
${b - ge < 70 ? "voice-visualizer__progress-indicator-hovered-time-left" : ""}
${_ ?? ""}`,
${T - me < 70 ? "voice-visualizer__progress-indicator-hovered-time-left" : ""}
${s ?? ""}`,
children: ke(
r / b * ge
r / T * me
)

@@ -595,14 +593,14 @@ }

),
Y && f && !o && r ? /* @__PURE__ */ a(
p && z && !o && r ? /* @__PURE__ */ a(
"div",
{
className: `voice-visualizer__progress-indicator ${d ?? ""}`,
className: `voice-visualizer__progress-indicator ${R ?? ""}`,
style: {
left: Ze < b - 1 ? Ze : b - 1
left: Ze < T - 1 ? Ze : T - 1
},
children: C && /* @__PURE__ */ a(
children: V && /* @__PURE__ */ a(
"p",
{
className: `voice-visualizer__progress-indicator-time ${b - s * b / r < 70 ? "voice-visualizer__progress-indicator-time-left" : ""} ${K ?? ""}`,
children: x
className: `voice-visualizer__progress-indicator-time ${T - c * T / r < 70 ? "voice-visualizer__progress-indicator-time-left" : ""} ${ie ?? ""}`,
children: H
}

@@ -615,18 +613,18 @@ )

),
ie && /* @__PURE__ */ de(Ue, { children: [
/* @__PURE__ */ de("div", { className: "voice-visualizer__audio-info-container", children: [
t && /* @__PURE__ */ a("p", { className: "voice-visualizer__audio-info-time", children: m }),
r && !o ? /* @__PURE__ */ a("p", { children: H }) : null
ye && /* @__PURE__ */ ae(Fe, { children: [
/* @__PURE__ */ ae("div", { className: "voice-visualizer__audio-info-container", children: [
t && /* @__PURE__ */ a("p", { className: "voice-visualizer__audio-info-time", children: h }),
r && !o ? /* @__PURE__ */ a("p", { children: b }) : null
] }),
/* @__PURE__ */ de("div", { className: "voice-visualizer__buttons-container", children: [
/* @__PURE__ */ ae("div", { className: "voice-visualizer__buttons-container", children: [
t && /* @__PURE__ */ a(
"button",
{
className: `voice-visualizer__btn-left ${L ? "voice-visualizer__btn-left-microphone" : ""}`,
onClick: v,
className: `voice-visualizer__btn-left ${S ? "voice-visualizer__btn-left-microphone" : ""}`,
onClick: m,
children: /* @__PURE__ */ a(
"img",
{
src: L ? Be : Pe,
alt: L ? "Play" : "Pause"
src: S ? Pe : We,
alt: S ? "Play" : "Pause"
}

@@ -636,7 +634,7 @@ )

),
!p && /* @__PURE__ */ a(
!g && /* @__PURE__ */ a(
"button",
{
className: `voice-visualizer__btn-left ${t ? "voice-visualizer__visually-hidden" : ""}`,
onClick: v,
onClick: m,
disabled: o,

@@ -646,4 +644,4 @@ children: /* @__PURE__ */ a(

{
src: w ? dt : Pe,
alt: w ? "Play" : "Pause"
src: A ? zt : We,
alt: A ? "Play" : "Pause"
}

@@ -653,8 +651,8 @@ )

),
p && /* @__PURE__ */ a(
g && /* @__PURE__ */ a(
"button",
{
className: "voice-visualizer__btn-center",
onClick: M,
children: /* @__PURE__ */ a("img", { src: Be, alt: "Microphone" })
onClick: I,
children: /* @__PURE__ */ a("img", { src: Pe, alt: "Microphone" })
}

@@ -666,11 +664,11 @@ ),

className: `voice-visualizer__btn-center voice-visualizer__btn-center-pause ${t ? "" : "voice-visualizer__visually-hidden"}`,
onClick: u,
children: /* @__PURE__ */ a("img", { src: ft, alt: "Stop" })
onClick: v,
children: /* @__PURE__ */ a("img", { src: gt, alt: "Stop" })
}
),
!p && /* @__PURE__ */ a(
!g && /* @__PURE__ */ a(
"button",
{
onClick: j,
className: `voice-visualizer__btn ${we ?? ""}`,
onClick: _,
className: `voice-visualizer__btn ${Ie ?? ""}`,
disabled: o,

@@ -680,7 +678,7 @@ children: "Clear"

),
ce && n && /* @__PURE__ */ a(
re && n && /* @__PURE__ */ a(
"button",
{
onClick: I,
className: `voice-visualizer__btn ${we ?? ""}`,
onClick: d,
className: `voice-visualizer__btn ${Ie ?? ""}`,
disabled: o,

@@ -691,16 +689,7 @@ children: "Download Audio"

] })
] }),
f && /* @__PURE__ */ a(
"audio",
{
ref: Se,
src: c,
controls: !0,
style: { display: "none" }
}
)
] })
] });
}
);
function Lt({
function wt({
onStartRecording: e,

@@ -711,155 +700,143 @@ onStopRecording: t,

onClearCanvas: c,
onEndAudioPlayback: s,
onStartAudioPlayback: g,
onPausedAudioPlayback: v,
onResumedAudioPlayback: M
onEndAudioPlayback: u,
onStartAudioPlayback: M,
onPausedAudioPlayback: m,
onResumedAudioPlayback: I,
onErrorPlayingAudio: v
} = {}) {
const [u, I] = l(!1), [f, w] = l(!1), [L, o] = l(null), [p, H] = l(new Uint8Array(0)), [m, x] = l(!1), [j, Q] = l(null), [F, B] = l(null), [D, U] = l(0), [V, $] = l(0), [R, q] = l(0), [P, fe] = l(""), [ne, ie] = l(!0), [ce, X] = l(0), [oe, W] = l(!0), [se, Te] = l(!1), [Le, be] = l(!1), [ze, Y] = l(null), d = y(null), C = y(null), K = y(null), ae = y(null), G = y(null), T = y(null), _ = y(null), h = y(null), pe = !!(F && !m), we = ut(R), Se = ot(D), ge = ke(ce), Ee = Le || m;
const [d, z] = l(!1), [A, S] = l(!1), [o, g] = l(null), [b, h] = l(new Uint8Array(0)), [H, _] = l(!1), [ue, $] = l(null), [U, j] = l(null), [ee, F] = l(0), [O, D] = l(0), [B, te] = l(0), [k, J] = l(""), [ye, re] = l(!0), [ne, Y] = l(0), [G, he] = l(!0), [De, ge] = l(!1), [le, Q] = l(null), p = N(null), R = N(null), V = N(null), ie = N(null), q = N(null), C = N(null), x = N(null), s = N(null), Me = !!(U && !H), Te = ht(B), Ie = st(ee), Ee = ke(ne), me = De || H;
Z(() => {
if (!u || f)
if (!d || A)
return;
const A = setInterval(() => {
const S = performance.now();
U((N) => N + (S - V)), $(S);
const f = setInterval(() => {
const E = performance.now();
F((w) => w + (E - O)), D(E);
}, 1e3);
return () => clearInterval(A);
}, [V, f, u]), Z(() => {
if (!j || j.size === 0)
return () => clearInterval(f);
}, [O, A, d]), Z(() => {
if (le) {
K();
return;
(async () => {
var A;
}
}, [le]), Z(() => () => {
K();
}, []), Z(() => (G || window.addEventListener("beforeunload", pe), () => {
window.removeEventListener("beforeunload", pe);
}), [G]);
const pe = (i) => {
i.preventDefault(), i.returnValue = "";
}, T = async (i) => {
if (i)
try {
Y(null);
const S = new Blob([j], {
type: (A = d.current) == null ? void 0 : A.mimeType
}), N = URL.createObjectURL(S);
N && fe(N);
const O = await j.arrayBuffer(), z = new AudioContext(), je = (ve) => {
B(ve), q(ve.duration - 0.06);
}, me = (ve) => {
Y(ve);
};
z.decodeAudioData(
O,
je,
me
if (i.size === 0)
throw new Error("Error: The audio blob is empty");
const f = URL.createObjectURL(i);
J(f);
const E = await i.arrayBuffer(), Ne = await new AudioContext().decodeAudioData(E);
j(Ne), te(Ne.duration - 0.06), Q(null);
} catch (f) {
console.error("Error processing the audio blob:", f), Q(
f instanceof Error ? f : new Error("Error processing the audio blob")
);
} catch (S) {
if (console.error("Error processing the audio blob:", S), S instanceof Error) {
Y(S);
return;
}
Y(new Error("Error processing the audio blob"));
}
})();
}, [j]), Z(() => {
if (ze) {
te();
return;
}
}, [ze]), Z(() => () => {
_.current && cancelAnimationFrame(_.current), G.current && G.current.disconnect(), C.current && C.current.state !== "closed" && C.current.close(), T.current && cancelAnimationFrame(T.current), h != null && h.current && h.current.removeEventListener("ended", re), d.current && d.current.removeEventListener(
"dataavailable",
ue
);
}, []), Z(() => (!oe && !se && window.addEventListener("beforeunload", b), () => {
window.removeEventListener("beforeunload", b);
}), [oe, se]);
const b = (i) => {
i.preventDefault(), i.returnValue = "";
}, Ce = () => {
navigator.mediaDevices.getUserMedia({ audio: !0 }).then((i) => {
te(), W(!1), $(performance.now()), I(!0), o(i), C.current = new window.AudioContext(), K.current = C.current.createAnalyser(), ae.current = new Uint8Array(
K.current.frequencyBinCount
), G.current = C.current.createMediaStreamSource(i), G.current.connect(K.current), d.current = new MediaRecorder(i), d.current.addEventListener(
D(performance.now()), z(!0), g(i), R.current = new window.AudioContext(), V.current = R.current.createAnalyser(), ie.current = new Uint8Array(
V.current.frequencyBinCount
), q.current = R.current.createMediaStreamSource(i), q.current.connect(V.current), p.current = new MediaRecorder(i), p.current.addEventListener(
"dataavailable",
ue
), d.current.start(), ee();
ve
), p.current.start(), X();
}).catch((i) => {
if (console.error("Error starting audio recording:", i), i instanceof Error) {
Y(i);
Q(i);
return;
}
Y(new Error("Error starting audio recording"));
Q(new Error("Error starting audio recording"));
});
}, ee = () => {
K.current.getByteTimeDomainData(ae.current), H(new Uint8Array(ae.current)), T.current = requestAnimationFrame(ee);
}, ue = (i) => {
d.current && Q(i.data);
}, he = () => {
_.current && cancelAnimationFrame(_.current), h.current && (X(h.current.currentTime), _.current = requestAnimationFrame(he));
}, X = () => {
V.current.getByteTimeDomainData(ie.current), h(new Uint8Array(ie.current)), C.current = requestAnimationFrame(X);
}, ve = (i) => {
s.current = new Audio(), $(i.data), T(i.data);
}, ce = () => {
s.current && (Y(s.current.currentTime), x.current = requestAnimationFrame(ce));
}, _e = () => {
u || (e && e(), Ce());
}, le = () => {
u && (t && t(), x(!0), I(!1), U(0), w(!1), T.current && cancelAnimationFrame(T.current), G.current && G.current.disconnect(), C.current && C.current.state !== "closed" && C.current.close(), L == null || L.getTracks().forEach((i) => i.stop()), d.current && (d.current.stop(), d.current.removeEventListener(
K(), he(!1), !d && (e && e(), Ce());
}, oe = () => {
d && (p.current && (p.current.stop(), p.current.removeEventListener(
"dataavailable",
ue
)));
}, te = () => {
T.current && cancelAnimationFrame(T.current), h != null && h.current && h.current.removeEventListener("ended", re), _.current && cancelAnimationFrame(_.current), d.current && (d.current.removeEventListener(
ve
), p.current = null), o == null || o.getTracks().forEach((i) => i.stop()), t && t(), C.current && cancelAnimationFrame(C.current), q.current && q.current.disconnect(), R.current && R.current.state !== "closed" && R.current.close(), _(!0), z(!1), F(0), S(!1));
}, K = () => {
C.current && cancelAnimationFrame(C.current), x.current && cancelAnimationFrame(x.current), p.current && (p.current.removeEventListener(
"dataavailable",
ue
), d.current.stop(), d.current = null), L == null || L.getTracks().forEach((i) => i.stop()), d.current = null, C.current = null, K.current = null, ae.current = null, G.current = null, T.current = null, _.current = null, c && c(), o(null), I(!1), x(!1), Q(null), B(null), U(0), $(0), q(0), fe(""), X(0), ie(!0), w(!1), H(new Uint8Array(0)), Y(null), W(!0);
}, Ne = (i) => {
i instanceof Blob && (te(), Te(!0), W(!1), x(!0), I(!1), U(0), w(!1), Q(i));
}, ye = () => {
var i, A, S, N;
if (u) {
w((O) => !O), ((i = d.current) == null ? void 0 : i.state) === "recording" ? (n && n(), (A = d.current) == null || A.pause(), U((O) => O + (performance.now() - V)), T.current && cancelAnimationFrame(T.current)) : (r && r(), (S = d.current) == null || S.resume(), $(performance.now()), T.current = requestAnimationFrame(ee));
ve
), p.current.stop(), p.current = null), o == null || o.getTracks().forEach((i) => i.stop()), s != null && s.current && (s.current.removeEventListener("ended", se), s.current.pause(), s.current.src = "", s.current = null), p.current = null, R.current = null, V.current = null, ie.current = null, q.current = null, C.current = null, x.current = null, c && c(), g(null), z(!1), _(!1), $(null), j(null), F(0), D(0), te(0), J(""), Y(0), re(!0), S(!1), h(new Uint8Array(0)), Q(null), he(!0);
}, we = () => {
if (s.current && s.current.paused) {
const i = s.current.play();
i !== void 0 && i.catch((f) => {
console.error(f), v && v(
f instanceof Error ? f : new Error("Error playing audio")
);
});
}
}, Le = () => {
var i, f, E;
if (d) {
S((w) => !w), ((i = p.current) == null ? void 0 : i.state) === "recording" ? (n && n(), (f = p.current) == null || f.pause(), F((w) => w + (performance.now() - O)), C.current && cancelAnimationFrame(C.current)) : (r && r(), (E = p.current) == null || E.resume(), D(performance.now()), C.current = requestAnimationFrame(X));
return;
}
if (h.current && pe)
if (_.current && cancelAnimationFrame(_.current), h.current.paused)
g && ce === 0 && g(), M && ce !== 0 && M(), h.current.addEventListener("ended", re), he(), ie(!1), (N = h.current) == null || N.play();
if (s.current && Me)
if (s.current.paused)
M && ne === 0 && M(), I && ne !== 0 && I(), requestAnimationFrame(ce), s.current.addEventListener("ended", se), we(), re(!1);
else {
v && v(), h.current.removeEventListener("ended", re), h.current.pause(), ie(!0);
const O = h.current.currentTime;
X(O), h.current.currentTime = O;
x.current && cancelAnimationFrame(x.current), m && m(), s.current.removeEventListener("ended", se), s.current.pause(), re(!0);
const w = s.current.currentTime;
Y(w), s.current.currentTime = w;
}
}, re = () => {
ie(!0), s && s(), h != null && h.current && (h.current.currentTime = 0, X(0));
}, Ae = () => {
var A;
if (!P)
}, se = () => {
x.current && cancelAnimationFrame(x.current), re(!0), u && u(), s != null && s.current && (s.current.currentTime = 0, Y(0));
}, Se = () => {
var f;
if (!k)
return;
const i = document.createElement("a");
i.href = P, i.download = `recorded_audio${at(
(A = d.current) == null ? void 0 : A.mimeType
)}`, document.body.appendChild(i), i.click(), document.body.removeChild(i), URL.revokeObjectURL(P);
i.href = k, i.download = `recorded_audio${ut(
(f = p.current) == null ? void 0 : f.mimeType
)}`, document.body.appendChild(i), i.click(), document.body.removeChild(i), URL.revokeObjectURL(k);
};
return {
isRecordingInProgress: u,
isPausedRecording: f,
audioData: p,
recordingTime: D,
isProcessingRecordedAudio: Ee,
recordedBlob: j,
mediaRecorder: d.current,
duration: R,
currentAudioTime: ce,
audioSrc: P,
isPausedRecordedAudio: ne,
bufferFromRecordedBlob: F,
isCleared: oe,
isAvailableRecordedAudio: pe,
isPreloadedBlob: se,
formattedDuration: we,
formattedRecordingTime: Se,
formattedRecordedAudioCurrentTime: ge,
setPreloadedAudioBlob: Ne,
audioRef: s,
isRecordingInProgress: d,
isPausedRecording: A,
audioData: b,
recordingTime: ee,
isProcessingRecordedAudio: me,
recordedBlob: ue,
mediaRecorder: p.current,
duration: B,
currentAudioTime: ne,
audioSrc: k,
isPausedRecordedAudio: ye,
bufferFromRecordedBlob: U,
isCleared: G,
isAvailableRecordedAudio: Me,
formattedDuration: Te,
formattedRecordingTime: Ie,
formattedRecordedAudioCurrentTime: Ee,
startRecording: _e,
togglePauseResume: ye,
stopRecording: le,
saveAudioFile: Ae,
clearCanvas: te,
setCurrentAudioTime: X,
error: ze,
_setIsProcessingAudioOnComplete: x,
_setIsProcessingOnResize: be,
audioRef: h
togglePauseResume: Le,
stopRecording: oe,
saveAudioFile: Se,
clearCanvas: K,
setCurrentAudioTime: Y,
error: le,
_setIsProcessingAudioOnComplete: _,
_setIsProcessingOnResize: ge
};
}
export {
It as VoiceVisualizer,
Lt as useVoiceVisualizer
pt as VoiceVisualizer,
wt as useVoiceVisualizer
};

@@ -7,2 +7,3 @@ import { Dispatch, MutableRefObject, SetStateAction } from "react";

export interface Controls {
audioRef: MutableRefObject<HTMLAudioElement | null>;
isRecordingInProgress: boolean;

@@ -20,4 +21,2 @@ isPausedRecording: boolean;

isAvailableRecordedAudio: boolean;
isPreloadedBlob: boolean;
setPreloadedAudioBlob: (blob: unknown) => void;
recordedBlob: Blob | null;

@@ -37,3 +36,2 @@ bufferFromRecordedBlob: AudioBuffer | null;

_setIsProcessingOnResize: Dispatch<SetStateAction<boolean>>;
audioRef: MutableRefObject<HTMLAudioElement | null>;
}

@@ -110,2 +108,3 @@ export interface BarsData {

onResumedAudioPlayback?: () => void;
onErrorPlayingAudio?: (error: Error) => void;
}

@@ -112,0 +111,0 @@ export interface UseWebWorkerParams<T> {

{
"name": "react-voice-visualizer",
"private": false,
"version": "1.3.8",
"version": "1.7.4",
"type": "module",

@@ -6,0 +6,0 @@ "author": "Yurii Zarytskyi",

@@ -47,3 +47,3 @@ # react-voice-visualizer [Demo App](https://react-voice-visualizer.vercel.app/)

```jsx
```typescript jsx
import { useEffect } from "react";

@@ -73,7 +73,7 @@ import { useVoiceVisualizer, VoiceVisualizer } from "react-voice-visualizer";

console.log(error);
console.error(error);
}, [error]);
return (
<VoiceVisualizer controls={recorderControls} ref={audioRef}/>
<VoiceVisualizer ref={audioRef} controls={recorderControls} />
);

@@ -85,49 +85,2 @@ };

Additionally, you can use the setPreloadedAudioBlob function to load any audio data. Pass your audio data in a `Blob` format to this function:
```
setPreloadedAudioBlob(audioBlob);
```
Example:
```jsx
import { useEffect } from 'react';
import { useVoiceVisualizer, VoiceVisualizer } from 'react-voice-visualizer';
const App = () => {
const recorderControls = useVoiceVisualizer();
const {
// ... (Extracted controls and states, if necessary)
setPreloadedAudioBlob,
isPreloadedBlob,
error,
audioRef
} = recorderControls;
useEffect(() => {
// Set the preloaded audioBlob when the component mounts
// Assuming 'audioBlob' is defined somewhere
if (audioBlob) {
setPreloadedAudioBlob(audioBlob);
}
}, [audioBlob]);
// Get and log any error when it occurs
useEffect(() => {
if (!error) return;
console.log(error);
}, [error]);
return (
<VoiceVisualizer
isControlPanelShown={false} // Set to 'false' in most cases, but should be determined based on the specific user's use case.
controls={recorderControls}
ref={audioRef}
/>
);
};
export default App;
```
## Getting started

@@ -157,56 +110,47 @@

| Parameter | Type | Description |
|:-------------------------|:----------------------------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| `onStartRecording` | `() => void` | Callback function triggered when recording starts. |
| `onStopRecording` | `() => void` | Callback function triggered when recording stops. |
| `onPausedRecording` | `() => void` | Callback function triggered when recording is paused. |
| `onResumedRecording` | `() => void` | Callback function triggered when recording is resumed. |
| `onClearCanvas` | `() => void` | Callback function triggered when the canvas is cleared. |
| `onEndAudioPlayback` | `() => void` | Callback function triggered when audio playback ends. |
| `onStartAudioPlayback` | `() => void` | Callback function triggered when audio playback starts. |
| `onPausedAudioPlayback` | `() => void` | Callback function triggered when audio playback is paused. |
| `onResumedAudioPlayback` | `() => void` | Callback function triggered when audio playback is resumed. |
| Parameter | Type | Description |
|:-------------------------|:-------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------|
| `onStartRecording` | `() => void` | Callback function triggered when recording starts. |
| `onStopRecording` | `() => void` | Callback function triggered when recording stops. |
| `onPausedRecording` | `() => void` | Callback function triggered when recording is paused. |
| `onResumedRecording` | `() => void` | Callback function triggered when recording is resumed. |
| `onClearCanvas` | `() => void` | Callback function triggered when the canvas is cleared. |
| `onEndAudioPlayback` | `() => void` | Callback function triggered when audio playback ends. |
| `onStartAudioPlayback` | `() => void` | Callback function triggered when audio playback starts. |
| `onPausedAudioPlayback` | `() => void` | Callback function triggered when audio playback is paused. |
| `onResumedAudioPlayback` | `() => void` | Callback function triggered when audio playback is resumed. |
| `onErrorPlayingAudio`    | `(error: Error) => void` | Callback function invoked when an error occurs during `audio.play()`. It provides an opportunity to handle and respond to such errors. |
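The `onErrorPlayingAudio` callback receives the rejection from `audio.play()` (most commonly an autoplay `NotAllowedError`). A minimal sketch of wiring it up, assuming only the callback signatures documented in the table above (the `PlaybackCallbacks` type and the simulated error are ours, for illustration):

```typescript
// Subset of the hook's params relevant to playback (shape per the table above).
type PlaybackCallbacks = {
  onStartAudioPlayback?: () => void;
  onErrorPlayingAudio?: (error: Error) => void;
};

const events: string[] = [];

const params: PlaybackCallbacks = {
  onStartAudioPlayback: () => events.push("playback started"),
  // Browsers commonly reject play() with a NotAllowedError under autoplay policies.
  onErrorPlayingAudio: (error) => events.push(`playback error: ${error.message}`),
};

// Simulate the hook invoking the callback when play() rejects:
params.onErrorPlayingAudio?.(new Error("NotAllowedError"));
console.log(events); // ["playback error: NotAllowedError"]
```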
##### Returns
| Returns | Type | Description |
|:---------------------------------------------------------------|:----------------------------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| `audioRef` | `MutableRefObject`<br/>`<HTMLAudioElement \| null>` | Reference to the audio element used for playback. |
| `isRecordingInProgress` | `boolean` | Indicates if audio recording is currently in progress. |
| `isPausedRecording` | `boolean` | Indicates if audio recording is currently paused. |
| `audioData` | `Uint8Array` | Audio data for real-time visualization. |
| `recordingTime` | `number` | Elapsed time during recording in miliseconds. |
| `mediaRecorder` | `MediaRecorder \| null` | MediaRecorder instance used for recording audio. |
| `duration` | `number` | Duration of the recorded audio in seconds. |
| `currentAudioTime` | `number` | Current playback time of the recorded audio in seconds. |
| `audioSrc` | `string` | Source URL of the recorded audio file for playback. |
| `isPausedRecordedAudio` | `boolean` | Indicates if recorded audio playback is paused. |
| `isProcessingRecordedAudio` | `boolean` | Indicates if the recorded audio is being processed and 'Processing Audio...' text shown. |
| `isCleared` | `boolean` | Indicates if the canvas has been cleared. |
| `isPreloadedBlob` | `boolean` | Indicates whether a blob of recorded audio data has been preloaded. |
| `isAvailableRecordedAudio`                                     | `boolean`                                           | Indicates whether recorded audio is available and not currently being processed. This return value can be used to check if it's an appropriate time to work with recorded audio data in your application. |
| `recordedBlob` | `Blob \| null` | Recorded audio data in Blob format. |
| `bufferFromRecordedBlob` | `AudioBuffer \| null` | Audio buffer from the recorded Blob. |
| `formattedDuration` | `string` | Formatted duration time in format 09:51m. |
| `formattedRecordingTime` | `string` | Formatted recording current time in format 09:51. |
| `formattedRecordedAudioCurrentTime` | `string` | Formatted recorded audio current time in format 09:51:1. |
| `setPreloadedAudioBlob` | `(audioBlob: Blob) => void` | This function allows you to load an existing audio blob for further processing, playback and visualization. The `audioBlob` parameter represents the recorded audio data stored in a Blob format. |
| `startRecording` | `() => void` | Function to start audio recording. |
| `togglePauseResume` | `() => void` | Function to toggle pause/resume during recording and playback of recorded audio. |
| `stopRecording` | `() => void` | Function to stop audio recording. |
| `saveAudioFile` | `() => void` | This function allows you to save the recorded audio as a `webm` file format. Please note that it supports saving audio only in the webm format. If you need to save the audio in a different format, you can use external libraries like FFmpeg to convert the Blob to your desired format. This flexibility allows you to tailor the output format according to your specific needs. |
| `clearCanvas` | `() => void` | Function to clear the visualization canvas. |
| `setCurrentAudioTime` | `Dispatch<SetStateAction<number>>` | Internal function to handle current audio time updates during playback. |
| `error` | `Error \| null` | Error object if any error occurred during recording or playback. |
| `_setIsProcessingAudioOnComplete` | `Dispatch<SetStateAction<boolean>>` | Internal function to set IsProcessingAudioOnComplete state. |
| `_setIsProcessingOnResize` | `Dispatch<SetStateAction<boolean>>` | Internal function to set IsProcessingOnResize state. |
| Returns | Type | Description |
|:----------------------------------------|:-------------------------------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| `audioRef` | `MutableRefObject`<br/>`<HTMLAudioElement \| null>` | Reference to the audio element used for playback. |
| `isRecordingInProgress` | `boolean` | Indicates if audio recording is currently in progress. |
| `isPausedRecording` | `boolean` | Indicates if audio recording is currently paused. |
| `audioData` | `Uint8Array` | Audio data for real-time visualization. |
| `recordingTime` | `number` | Elapsed time during recording in milliseconds. |
| `mediaRecorder` | `MediaRecorder \| null` | MediaRecorder instance used for recording audio. |
| `duration` | `number` | Duration of the recorded audio in seconds. |
| `currentAudioTime` | `number` | Current playback time of the recorded audio in seconds. |
| `audioSrc` | `string` | Source URL of the recorded audio file for playback. |
| `isPausedRecordedAudio` | `boolean` | Indicates if recorded audio playback is paused. |
| `isProcessingRecordedAudio`             | `boolean`                                               | Indicates if the recorded audio is being processed and the 'Processing Audio...' text is shown. |
| `isCleared` | `boolean` | Indicates if the canvas has been cleared. |
| `isAvailableRecordedAudio`              | `boolean`                                               | Indicates whether recorded audio is available and not currently being processed. This return value can be used to check whether it's an appropriate time to work with the recorded audio data in your application. |
| `recordedBlob` | `Blob \| null` | Recorded audio data in Blob format. |
| `bufferFromRecordedBlob` | `AudioBuffer \| null` | Audio buffer from the recorded Blob. |
| `formattedDuration`                     | `string`                                                | Formatted duration in the format 09:51m. |
| `formattedRecordingTime`                | `string`                                                | Formatted current recording time in the format 09:51. |
| `formattedRecordedAudioCurrentTime`     | `string`                                                | Formatted current playback time of the recorded audio in the format 09:51:1. |
| `startRecording` | `() => void` | Function to start audio recording. |
| `togglePauseResume` | `() => void` | Function to toggle pause/resume during recording and playback of recorded audio. |
| `stopRecording` | `() => void` | Function to stop audio recording. |
| `saveAudioFile`                         | `() => void`                                            | Saves the recorded audio as a `webm` file. Note that only the `webm` format is supported. If you need the audio in a different format, you can use an external library such as `FFmpeg` to convert the Blob, tailoring the output format to your specific needs. |
| `clearCanvas` | `() => void` | Function to clear the visualization canvas. |
| `setCurrentAudioTime` | `Dispatch<SetStateAction<number>>` | Internal function to handle current audio time updates during playback. |
| `error` | `Error \| null` | Error object if any error occurred during recording or playback. |
| `_setIsProcessingAudioOnComplete` | `Dispatch<SetStateAction<boolean>>` | Internal function to set `isProcessingAudioOnComplete` state. |
| `_setIsProcessingOnResize` | `Dispatch<SetStateAction<boolean>>` | Internal function to set `isProcessingOnResize` state. |
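`saveAudioFile` derives the downloaded file's extension from the recorder's `mimeType`. The published bundle is minified, so the helper below is only a hypothetical sketch of that mapping, not the library's actual implementation:

```typescript
// Hypothetical sketch: derive a download extension from a MediaRecorder
// mimeType such as "audio/webm;codecs=opus". Falls back to ".webm", the
// default MediaRecorder container in most browsers.
function extensionFromMimeType(mimeType?: string): string {
  if (!mimeType) return ".webm";
  const subtype = mimeType.split("/")[1] ?? "";   // "webm;codecs=opus"
  const container = subtype.split(";")[0];        // "webm"
  return `.${container || "webm"}`;
}

console.log(extensionFromMimeType("audio/webm;codecs=opus")); // ".webm"
console.log(extensionFromMimeType(undefined));                // ".webm"
```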
#### Load and visualize any Audio
You can use the `setPreloadedAudioBlob` function to load any audio data for playback and visualization. Pass your audio data to this function as a `Blob`:
```jsx
setPreloadedAudioBlob(audioBlob);
```
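The hook rejects zero-byte blobs (it throws "Error: The audio blob is empty" while processing), so it is worth guarding before calling `setPreloadedAudioBlob`. A sketch with an in-memory blob; the guard function name is ours, not the library's:

```typescript
// Guard mirroring the hook's own check: a zero-byte blob cannot be decoded.
function assertNonEmptyAudioBlob(blob: Blob): Blob {
  if (blob.size === 0) throw new Error("The audio blob is empty");
  return blob;
}

// In a real app the bytes would come from fetch() or a file input; the EBML
// magic bytes below merely stand in for real webm data.
const audioBlob = new Blob([new Uint8Array([0x1a, 0x45, 0xdf, 0xa3])], {
  type: "audio/webm",
});

console.log(assertNonEmptyAudioBlob(audioBlob).size); // 4
// setPreloadedAudioBlob(assertNonEmptyAudioBlob(audioBlob));
```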
### `VoiceVisualizer` Component

@@ -220,4 +164,4 @@

|:--------------------------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:--------------|:-----------------------------|
| **`ref`** | A reference to the audio element - `audioRef` from the `useVoiceVisualizer` hook. | - | `React.RefObject` (Required) |
| **`controls`** | Provides the audio recording controls and states required for visualization. | - | `Controls` (Required) |
| **`ref`** | A reference to the audio element - `audioRef` from the `useVoiceVisualizer` hook. | - | `React.RefObject` (Required) |
| **`height`** | The height of the visualization canvas. | `200` | `string \| number` (Optional) |

@@ -273,3 +217,3 @@ | **`width`** | The width of the visualization canvas. | `100%` | `string \| number` (Optional) |

This library was created by [Yurii Zarytskyi](https://github.com/YZarytskyi)

@@ -276,0 +220,0 @@ <a href="https://www.linkedin.com/in/yurii-zarytskyi/" rel="nofollow noreferrer">
