Overview
Audio visualization with Visualizer
Demo
Visualizer
From the official documentation:
The Visualizer class enables application to retrieve part of the currently playing audio for visualization purpose. It is not an audio recording interface and only returns partial and low quality audio content. However, to protect privacy of certain audio data (e.g voice mail) the use of the visualizer requires the permission android.permission.RECORD_AUDIO.
The audio session ID passed to the constructor indicates which audio content should be visualized:
- If the session is 0, the audio output mix is visualized
- If the session is not 0, the audio from a particular MediaPlayer or AudioTrack using this audio session is visualized
Two types of representation of audio content can be captured:
- Waveform data: consecutive 8-bit (unsigned) mono samples by using the getWaveForm(byte[]) method
- Frequency data: 8-bit magnitude FFT by using the getFft(byte[]) method
The length of the capture can be retrieved or specified by calling respectively the getCaptureSize() and setCaptureSize(int) methods. The capture size must be a power of 2 in the range returned by getCaptureSizeRange().
In addition to the polling capture mode described above with the getWaveForm(byte[]) and getFft(byte[]) methods, a callback mode is also available by installing a listener by use of the setDataCaptureListener(android.media.audiofx.Visualizer.OnDataCaptureListener, int, boolean, boolean) method. The rate at which the listener capture method is called as well as the type of data returned is specified.
Before capturing data, the Visualizer must be enabled by calling the setEnabled(boolean) method. When data capture is not needed any more, the Visualizer should be disabled.
It is good practice to call the release() method when the Visualizer is not used anymore to free up native resources associated to the Visualizer instance.
Creating a Visualizer on the output mix (audio session 0) requires permission Manifest.permission.MODIFY_AUDIO_SETTINGS.
The Visualizer class can also be used to perform measurements on the audio being played back. The measurements to perform are defined by setting a mask of the requested measurement modes with setMeasurementMode(int). Supported values are MEASUREMENT_MODE_NONE to cancel any measurement, and MEASUREMENT_MODE_PEAK_RMS for peak and RMS monitoring. Measurements can be retrieved through getMeasurementPeakRms(android.media.audiofx.Visualizer.MeasurementPeakRms).
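As a side note, the peak and RMS values that getMeasurementPeakRms reports are in millibels (mB), so a small conversion is usually wanted before display. The helper below is a minimal sketch: the `MeasurementUtil` class name is my own, and the Android calls appear only in a comment so the snippet stays plain Java.

```java
// Hypothetical helper for interpreting Visualizer peak/RMS measurements.
// Visualizer reports values in millibels (mB); 100 mB = 1 dB.
public class MeasurementUtil {

    // Convert a millibel value, as found in MeasurementPeakRms.mPeak / mRms, to decibels.
    public static double millibelsToDecibels(int mB) {
        return mB / 100.0;
    }

    /* Android usage (sketch, not compiled here):
       visualizer.setMeasurementMode(Visualizer.MEASUREMENT_MODE_PEAK_RMS);
       Visualizer.MeasurementPeakRms m = new Visualizer.MeasurementPeakRms();
       visualizer.getMeasurementPeakRms(m);
       double peakDb = millibelsToDecibels(m.mPeak);
    */

    public static void main(String[] args) {
        System.out.println(millibelsToDecibels(-960)); // prints -9.6
    }
}
```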
Implementation
- Play the audio with MediaPlayer.
- Create a Visualizer object. Its constructor takes an audioSessionId, which is obtained from MediaPlayer's getAudioSessionId() method.
- As described in the official Visualizer documentation, install a setDataCaptureListener callback to capture waveform data or frequency data.
- Iterate over the captured data and draw the graphics.
Code implementation
-
First, request the required permissions:

<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />

RxPermissions rxPermissions = new RxPermissions(this);
rxPermissions.requestEach(Manifest.permission.RECORD_AUDIO,
        Manifest.permission.WRITE_EXTERNAL_STORAGE,
        Manifest.permission.READ_EXTERNAL_STORAGE)
        .subscribe(new Consumer<Permission>() {
            @Override
            public void accept(Permission permission) throws Exception {
                if (permission.granted) {
                    Log.d(TAG, "accept: true");
                } else if (permission.shouldShowRequestPermissionRationale) {
                    finish();
                } else {
                    finish();
                }
            }
        });

RxPermissions dependency: implementation 'com.tbruyelle.rxpermissions2:rxpermissions:0.9.5'
-
Play the audio file with MediaPlayer:

mMediaPlayer = MediaPlayer.create(this, R.raw.daoxiang);
mMediaPlayer.setOnErrorListener(null);
mMediaPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
    @Override
    public void onPrepared(MediaPlayer mediaPlayer) {
        mediaPlayer.setLooping(true); // loop playback
    }
});
mMediaPlayer.start();
-
Get the audioSessionId:

int audioSessionId = mediaPlayer.getAudioSessionId();
-
Create the Visualizer object:

visualizer = new Visualizer(audioSessionId);
// After creating the Visualizer instance, set its capture size. The valid range is
// Visualizer.getCaptureSizeRange()[0] ~ Visualizer.getCaptureSizeRange()[1]; here we use the maximum:
visualizer.setCaptureSize(Visualizer.getCaptureSizeRange()[1]);
-
Register a data capture callback on the Visualizer with setDataCaptureListener:

visualizer.setDataCaptureListener(new Visualizer.OnDataCaptureListener() {
    @Override
    public void onWaveFormDataCapture(Visualizer visualizer, byte[] bytes, int samplingRate) {
    }

    @Override
    public void onFftDataCapture(Visualizer visualizer, byte[] fft, int samplingRate) {
        int n = fft.length;
        float[] model = new float[n / 2 + 1];
        model[0] = Math.abs(fft[0]);     // DC component (real part only)
        model[n / 2] = Math.abs(fft[1]); // n/2 component (real part only)
        for (int k = 1; k < n / 2; k++) {
            // magnitude of the kth frequency point from its real/imaginary parts
            model[k] = (float) Math.hypot(fft[2 * k], fft[2 * k + 1]);
        }
        // model is the final data used for drawing
    }
}, Visualizer.getMaxCaptureRate() / 2, false, true);
The parameters of setDataCaptureListener are:
- listener: the callback object
- rate: the capture rate, in the range 0 ~ Visualizer.getMaxCaptureRate(); here it is set to half the maximum
- waveform: whether to capture waveform data
- fft: whether to capture data after the fast Fourier transform (FFT)

OnDataCaptureListener has two callbacks:
- onWaveFormDataCapture: waveform data callback
- onFftDataCapture: FFT data callback, i.e. frequency data

Here we use the FFT data for the visualization. The byte array returned in onFftDataCapture holds the FFT output, but it still needs some processing. With the capture size set above to Visualizer.getCaptureSizeRange()[1], i.e. 1024 sample points, each block of 1024 real samples is transformed into 1024 complex points. Because of conjugate symmetry, the first 512 points mirror the last 512, so we take the first 513 points (including point 0); point 0 and point 512 are purely real, and the 511 points in between are complex.

onFftDataCapture(Visualizer visualizer, byte[] bytes, int samplingRate)

The FFT data arrives as bytes in byte[1024], containing 1 + 1 + (1024 - 2) / 2 = 513 valid FFT points: the DC component and the n/2 component each occupy one slot, while every other frequency point occupies two slots, real part + imaginary part.
The frequency range obtained is 0 ~ sampling rate / 2 = 0 ~ 22.05 kHz, i.e. the 513 frequency points are distributed over [0 Hz, 22.05 kHz].
The spacing between adjacent frequency points (in mHz) = sampling rate / (1024 / 2) = 44,100,000 / 512 = 86,132 mHz ≈ 86.132 Hz; the resolution is therefore 86.132 Hz, and smaller frequency intervals cannot be resolved.
Sampling rate: the number of points sampled from the audio stream per second.

frequencyEach = samplingRate * 2 / visualizer.getCaptureSize(); // 86132; samplingRate = 44,100,000 mHz, getCaptureSize() = 1024
// Since the returned byte values can be negative, take magnitudes:
int n = fft.length;
float[] model = new float[n / 2 + 1];
model[0] = Math.abs(fft[0]);     // DC component (real part only)
model[n / 2] = Math.abs(fft[1]); // n/2 component (real part only)
for (int k = 1; k < n / 2; k++) {
    // magnitude of the kth frequency point from its real/imaginary parts
    model[k] = (float) Math.hypot(fft[2 * k], fft[2 * k + 1]);
}
// model is the final data used for drawing
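The magnitude conversion can be isolated into a plain-Java helper so it can be exercised off-device with a synthetic byte array. This is a sketch under the layout described above (byte 0 is the DC real part, byte 1 the n/2 real part, followed by real/imaginary pairs); the `FftUtil` class name is my own:

```java
import java.util.Arrays;

// Hypothetical helper: convert a Visualizer-style FFT byte array into per-bin magnitudes.
public class FftUtil {

    public static float[] magnitudes(byte[] fft) {
        int n = fft.length;
        float[] model = new float[n / 2 + 1];
        model[0] = Math.abs(fft[0]);     // DC component (real part only)
        model[n / 2] = Math.abs(fft[1]); // n/2 component (real part only)
        for (int k = 1; k < n / 2; k++) {
            // magnitude from the real/imaginary pair of bin k
            model[k] = (float) Math.hypot(fft[2 * k], fft[2 * k + 1]);
        }
        return model;
    }

    public static void main(String[] args) {
        // Tiny synthetic capture: n = 6 bytes -> 4 magnitude bins.
        byte[] fft = {10, -2, 3, 4, 0, -5};
        float[] m = magnitudes(fft);
        // m[0] = |10|, m[1] = hypot(3,4), m[2] = hypot(0,-5), m[3] = |-2|
        System.out.println(Arrays.toString(m));
    }
}
```

In the app, `model` in onFftDataCapture is exactly `FftUtil.magnitudes(fft)`.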
-
Enable the Visualizer:

visualizer.setEnabled(true);
-
Draw the graphics in a custom view (the full VisualizeView implementation is listed once in the complete-code section below).
-
Release resources when exiting the app:

@Override
protected void onDestroy() {
    super.onDestroy();
    if (mMediaPlayer != null) {
        mMediaPlayer.stop();
        mMediaPlayer.reset();
        mMediaPlayer.release();
        mMediaPlayer = null;
    }
    if (visualizer != null) {
        visualizer.setEnabled(false);
        visualizer.release();
    }
}
Tip: using a Spinner
-
Create an arrays XML file under the values directory:

<?xml version="1.0" encoding="utf-8"?>
<resources>
    <string-array name="view_type">
        <item>SINGLE</item>
        <item>CIRCLE</item>
        <item>NET</item>
        <item>REFLECT</item>
        <item>WAVE</item>
        <item>GRAIN</item>
    </string-array>
</resources>

-
Use the Spinner:

<Spinner
    android:id="@+id/spinner_view"
    android:layout_width="200px"
    android:layout_height="wrap_content"
    android:entries="@array/view_type"
    app:layout_constraintEnd_toEndOf="parent"
    app:layout_constraintTop_toTopOf="parent" />

mBinding.spinnerView.setOnItemSelectedListener(new AdapterView.OnItemSelectedListener() {
    @Override
    public void onItemSelected(AdapterView<?> parent, View view, int position, long id) {
        mBinding.visualizerView.setMode(position);
    }

    @Override
    public void onNothingSelected(AdapterView<?> parent) {
    }
});
Complete code
-
MainActivity

public class MainActivity extends AppCompatActivity {
    private static final String TAG = "MainActivity";
    Visualizer visualizer;
    int mCount = 60;
    ActivityMainBinding mBinding;
    private MediaPlayer mMediaPlayer;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        mBinding = DataBindingUtil.setContentView(this, R.layout.activity_main);
        RxPermissions rxPermissions = new RxPermissions(this);
        rxPermissions.requestEach(Manifest.permission.RECORD_AUDIO,
                Manifest.permission.WRITE_EXTERNAL_STORAGE,
                Manifest.permission.READ_EXTERNAL_STORAGE)
                .subscribe(new Consumer<Permission>() {
                    @Override
                    public void accept(Permission permission) throws Exception {
                        if (permission.granted) {
                            Log.d(TAG, "accept: true");
                        } else if (permission.shouldShowRequestPermissionRationale) {
                            finish();
                        } else {
                            finish();
                        }
                    }
                });
        mMediaPlayer = MediaPlayer.create(this, R.raw.daoxiang);
        if (mMediaPlayer == null) {
            Log.d(TAG, "mediaPlayer is null");
            return;
        }
        mMediaPlayer.setOnErrorListener(null);
        mMediaPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
            @Override
            public void onPrepared(MediaPlayer mediaPlayer) {
                mediaPlayer.setLooping(true); // loop playback
                int audioSessionId = mediaPlayer.getAudioSessionId();
                visualizer = new Visualizer(audioSessionId);
                visualizer.setCaptureSize(Visualizer.getCaptureSizeRange()[1]);
                visualizer.setDataCaptureListener(new Visualizer.OnDataCaptureListener() {
                    @Override
                    public void onWaveFormDataCapture(Visualizer visualizer, byte[] bytes, int samplingRate) {
                    }

                    @Override
                    public void onFftDataCapture(Visualizer visualizer, byte[] fft, int samplingRate) {
                        Log.d(TAG, "onFftDataCapture: fft " + fft.length);
                        int n = fft.length;
                        float[] model = new float[n / 2 + 1];
                        model[0] = Math.abs(fft[0]);     // DC component (real part only)
                        model[n / 2] = Math.abs(fft[1]); // n/2 component (real part only)
                        for (int k = 1; k < n / 2; k++) {
                            model[k] = (float) Math.hypot(fft[2 * k], fft[2 * k + 1]);
                        }
                        // model is the final data used for drawing
                        mBinding.visualizerView.setData(model);
                    }
                }, Visualizer.getMaxCaptureRate() / 2, false, true);
                visualizer.setEnabled(true);
            }
        });
        mMediaPlayer.start();
        mBinding.spinnerView.setOnItemSelectedListener(new AdapterView.OnItemSelectedListener() {
            @Override
            public void onItemSelected(AdapterView<?> parent, View view, int position, long id) {
                mBinding.visualizerView.setMode(position);
            }

            @Override
            public void onNothingSelected(AdapterView<?> parent) {
            }
        });
    }

    @Override
    protected void onDestroy() {
        super.onDestroy();
        if (mMediaPlayer != null) {
            mMediaPlayer.stop();
            mMediaPlayer.reset();
            mMediaPlayer.release();
            mMediaPlayer = null;
        }
        if (visualizer != null) {
            visualizer.setEnabled(false);
            visualizer.release();
        }
    }
}
-
Layout file

<?xml version="1.0" encoding="utf-8"?>
<layout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    xmlns:tools="http://schemas.android.com/tools">

    <data>
    </data>

    <androidx.constraintlayout.widget.ConstraintLayout
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        tools:context=".MainActivity">

        <Spinner
            android:id="@+id/spinner_view"
            android:layout_width="200px"
            android:layout_height="wrap_content"
            android:entries="@array/view_type"
            app:layout_constraintEnd_toEndOf="parent"
            app:layout_constraintTop_toTopOf="parent" />

        <com.learn.visualizer.VisualizeView
            android:id="@+id/visualizer_view"
            android:layout_width="match_parent"
            android:layout_height="match_parent"
            app:layout_constraintBottom_toBottomOf="parent"
            app:layout_constraintLeft_toLeftOf="parent"
            app:layout_constraintRight_toRightOf="parent"
            app:layout_constraintTop_toTopOf="parent" />
    </androidx.constraintlayout.widget.ConstraintLayout>
</layout>
-
Custom view

package com.learn.visualizer;

import android.content.Context;
import android.graphics.BlurMaskFilter;
import android.graphics.Canvas;
import android.graphics.Paint;
import android.graphics.Path;
import android.graphics.RectF;
import android.util.AttributeSet;
import android.util.Log;
import android.view.View;

import androidx.annotation.Nullable;

public class VisualizeView extends View {
    private static final String TAG = "SingleVisualizeView";
    /** the number of spectrum bars */
    protected int mSpectrumCount = 60;
    /** the width of each spectrum bar */
    protected float mStrokeWidth;
    /** the color used for drawing the spectrum */
    protected int mColor;
    /** audio data transformed by hypot */
    protected float[] mRawAudioBytes;
    /** the margin between adjoining spectrum bars */
    protected float mItemMargin = 12;
    protected float mSpectrumRatio = 2;
    protected RectF mRect;
    protected Paint mPaint;
    protected Path mPath;
    protected float centerX, centerY;
    private int mode;
    public static final int SINGLE = 0;
    public static final int CIRCLE = 1;
    public static final int NET = 2;
    public static final int REFLECT = 3;
    public static final int WAVE = 4;
    public static final int GRAIN = 5;
    float radius = 150;

    public VisualizeView(Context context) {
        super(context);
        init();
    }

    public VisualizeView(Context context, @Nullable AttributeSet attrs) {
        super(context, attrs);
        init();
    }

    protected void init() {
        mStrokeWidth = 5;
        mPaint = new Paint();
        mPaint.setStrokeWidth(mStrokeWidth);
        mPaint.setColor(getResources().getColor(R.color.black));
        mPaint.setStrokeCap(Paint.Cap.ROUND);
        mPaint.setAntiAlias(true);
        mPaint.setMaskFilter(new BlurMaskFilter(5, BlurMaskFilter.Blur.SOLID));
        mRect = new RectF();
        mPath = new Path();
    }

    @Override
    protected void onMeasure(int widthMeasureSpec, int heightMeasureSpec) {
        super.onMeasure(widthMeasureSpec, heightMeasureSpec);
        int finallyWidth;
        int finallyHeight;
        int wSpecMode = MeasureSpec.getMode(widthMeasureSpec);
        int wSpecSize = MeasureSpec.getSize(widthMeasureSpec);
        int hSpecMode = MeasureSpec.getMode(heightMeasureSpec);
        int hSpecSize = MeasureSpec.getSize(heightMeasureSpec);
        if (wSpecMode == MeasureSpec.EXACTLY) {
            finallyWidth = wSpecSize;
        } else {
            finallyWidth = 500;
        }
        if (hSpecMode == MeasureSpec.EXACTLY) {
            finallyHeight = hSpecSize;
        } else {
            finallyHeight = 500;
        }
        setMeasuredDimension(finallyWidth, finallyHeight);
    }

    @Override
    protected void onLayout(boolean changed, int left, int top, int right, int bottom) {
        super.onLayout(changed, left, top, right, bottom);
        mRect.set(0, 0, getWidth(), getHeight() - 50);
        centerX = mRect.width() / 2;
        centerY = mRect.height() / 2;
    }

    @Override
    protected void onDraw(Canvas canvas) {
        super.onDraw(canvas);
        if (mRawAudioBytes == null) {
            Log.d(TAG, "onDraw: ");
            return;
        }
        drawChild(canvas);
    }

    protected void drawChild(Canvas canvas) {
        mStrokeWidth = (mRect.width() - (mSpectrumCount - 1) * mItemMargin) / mSpectrumCount * 1.0f;
        mPaint.setStrokeWidth(mStrokeWidth);
        mPaint.setStyle(Paint.Style.FILL);
        switch (mode) {
            case SINGLE:
                for (int i = 0; i < mSpectrumCount; i++) {
                    canvas.drawLine(mRect.width() * i / mSpectrumCount, mRect.height() / 2,
                            mRect.width() * i / mSpectrumCount,
                            2 + mRect.height() / 2 - mRawAudioBytes[i], mPaint);
                }
                break;
            case CIRCLE:
                mStrokeWidth = (float) ((Math.PI * 2 * radius - (mSpectrumCount - 1) * mItemMargin) / mSpectrumCount * 1.0f);
                mPaint.setStyle(Paint.Style.STROKE);
                mPaint.setStrokeWidth(2);
                canvas.drawCircle(centerX, centerY, radius, mPaint);
                mPaint.setStrokeWidth(mStrokeWidth);
                mPaint.setStyle(Paint.Style.FILL);
                mPath.moveTo(0, centerY);
                for (int i = 0; i < mSpectrumCount; i++) {
                    double angel = ((360d / mSpectrumCount * 1.0d) * (i + 1));
                    double startX = centerX + (radius + mStrokeWidth / 2) * Math.sin(Math.toRadians(angel));
                    double startY = centerY + (radius + mStrokeWidth / 2) * Math.cos(Math.toRadians(angel));
                    double stopX = centerX + (radius + mStrokeWidth / 2 + mSpectrumRatio * mRawAudioBytes[i]) * Math.sin(Math.toRadians(angel));
                    double stopY = centerY + (radius + mStrokeWidth / 2 + mSpectrumRatio * mRawAudioBytes[i]) * Math.cos(Math.toRadians(angel));
                    canvas.drawLine((float) startX, (float) startY, (float) stopX, (float) stopY, mPaint);
                }
                break;
            case NET:
                mStrokeWidth = (float) ((Math.PI * 2 * radius - (mSpectrumCount - 1) * mItemMargin) / mSpectrumCount * 1.0f);
                mPaint.setStyle(Paint.Style.STROKE);
                mPaint.setStrokeWidth(2);
                canvas.drawCircle(centerX, centerY, radius, mPaint);
                mPaint.setStrokeWidth(mStrokeWidth);
                mPaint.setStyle(Paint.Style.FILL);
                mPath.moveTo(0, centerY);
                for (int i = 0; i < mSpectrumCount; i++) {
                    double angel = ((360d / mSpectrumCount * 1.0d) * (i + 1));
                    double startX = centerX + (radius + mStrokeWidth / 2) * Math.sin(Math.toRadians(angel));
                    double startY = centerY + (radius + mStrokeWidth / 2) * Math.cos(Math.toRadians(angel));
                    double stopX = centerX + (radius + mStrokeWidth / 2 + mSpectrumRatio * mRawAudioBytes[i]) * Math.sin(Math.toRadians(angel));
                    double stopY = centerY + (radius + mStrokeWidth / 2 + mSpectrumRatio * mRawAudioBytes[i]) * Math.cos(Math.toRadians(angel));
                    canvas.drawLine((float) startX, (float) startY, (float) stopX, (float) stopY, mPaint);
                    if (i == 0) {
                        mPath.moveTo((float) startX, (float) startY);
                    }
                    mPath.lineTo((float) stopX, (float) stopY);
                }
                mPaint.setStyle(Paint.Style.STROKE);
                canvas.drawPath(mPath, mPaint);
                mPath.reset();
                break;
            case REFLECT:
                mStrokeWidth = (mRect.width() - (mSpectrumCount - 1) * mItemMargin) / mSpectrumCount * 1.0f;
                mPaint.setStrokeWidth(mStrokeWidth);
                mPaint.setStyle(Paint.Style.FILL);
                for (int i = 0; i < mSpectrumCount; i++) {
                    canvas.drawLine(mRect.width() * i / mSpectrumCount, mRect.height() / 2,
                            mRect.width() * i / mSpectrumCount,
                            2 + mRect.height() / 2 - mSpectrumRatio * mRawAudioBytes[i], mPaint);
                    canvas.drawLine(mRect.width() * i / mSpectrumCount, mRect.height() / 2,
                            mRect.width() * i / mSpectrumCount,
                            2 + mRect.height() / 2 + mSpectrumRatio * mRawAudioBytes[i], mPaint);
                }
                break;
            case WAVE:
                mStrokeWidth = (mRect.width() - (mSpectrumCount - 1) * mItemMargin) / mSpectrumCount * 1.0f;
                mPaint.setStrokeWidth(mStrokeWidth);
                mPaint.setStyle(Paint.Style.FILL);
                mPath.moveTo(0, centerY);
                for (int i = 0; i < mSpectrumCount; i++) {
                    mPath.lineTo(mRect.width() * i / mSpectrumCount,
                            2 + mRect.height() / 2 + mRawAudioBytes[i]);
                }
                mPath.lineTo(mRect.width(), centerY);
                mPath.close();
                canvas.drawPath(mPath, mPaint);
                mPath.reset();
                break;
            case GRAIN:
                mStrokeWidth = (mRect.width() - (mSpectrumCount - 1) * mItemMargin) / mSpectrumCount * 1.0f;
                mPaint.setStrokeWidth(mStrokeWidth);
                mPaint.setStyle(Paint.Style.FILL);
                for (int i = 0; i < mSpectrumCount; i++) {
                    canvas.drawPoint(mRect.width() * i / mSpectrumCount,
                            2 + mRect.height() / 2 - mRawAudioBytes[i], mPaint);
                    canvas.drawPoint(mRect.width() * i / mSpectrumCount,
                            mRect.height() / 4 + 2 + (mRect.height() / 2 - mRawAudioBytes[i]) / 2, mPaint);
                }
                break;
            default:
                break;
        }
    }

    public void setMode(int mode) {
        this.mode = mode;
        if (mRawAudioBytes != null) {
            invalidate();
        }
    }

    public void setData(float[] parseData) {
        mRawAudioBytes = parseData;
        invalidate();
    }
}
References
- https://www.jianshu.com/p/c95bb166fb28
- https://blog.csdn.net/gkw421178132/article/details/71081628
- https://developer.android.google.cn/reference/android/media/audiofx/Visualizer#setDataCaptureListener(android.media.audiofx.Visualizer.OnDataCaptureListener,%20int,%20boolean,%20boolean)