First, in the code below I am trying to get the length of a 2D byte array with 'byteBuffer[0].length', but it is not working the way I expect. When I print 'byteBuffer[0].length' the output is 4 instead of 882000, which is what it should be given the arguments I pass in. So how do I iterate over it in my loop?
Second, I want to pass 'byteBuffer' to 'ByteArrayInputStream', but 'ByteArrayInputStream' does not accept a two-dimensional array. Is there a way to append the values and use them there? I also need to pass the values of 'frequency1' and 'frequency2' alternately and save the result in .wav format so that I can play it in my media player accordingly, like an ambulance siren. (An untested sketch of what I am aiming for follows the code.)
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.File;
import java.io.IOException;
import java.util.Scanner;
import javax.sound.sampled.AudioFileFormat;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
public class AudioPlay {

    public static void main(String[] args) throws IOException {
        Scanner in = new Scanner(System.in);
        final double SAMPLING_RATE = 44100; // Audio sampling rate
        int time = in.nextInt();            // Duration specified by the user, in seconds
        int frequency1 = in.nextInt();      // First frequency specified by the user, in Hz
        int frequency2 = in.nextInt();      // Second frequency specified by the user, in Hz

        // Size of buffer; if time is 10 seconds it will be [2][441000]
        float buffer[][] = new float[2][(int) (time * SAMPLING_RATE)];
        for (int sample = 0; sample < buffer[0].length; sample++) {
            double cycle = sample / SAMPLING_RATE; // Fraction of a cycle between samples
            buffer[0][sample] = (float) (Math.sin(2 * Math.PI * frequency1 * cycle)); // Value at every index of the 1st row
            buffer[1][sample] = (float) (Math.sin(2 * Math.PI * frequency2 * cycle)); // Value at every index of the 2nd row
        }
        // Size of byteBuffer; if time is 10 seconds it is meant to be [2][882000]
        byte byteBuffer[][] = new byte[2][(int) (buffer.length * 2)];
        System.out.println(byteBuffer[0].length); // Giving wrong output
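        // Note: buffer.length is the number of rows (2), so the allocation above gives each
        // row only 2 * 2 = 4 bytes; buffer[0].length * 2 would give the intended 882000.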
        int count = 0;
        for (int j = 0; j < byteBuffer.length; j++) {
            for (int i = 0; i < byteBuffer[0].length; i++) {
                final int x = (int) ((buffer[j][count++]) * Short.MAX_VALUE);
                byteBuffer[j][i++] = (byte) x;       // Low byte of the 16-bit sample
                byteBuffer[j][i] = (byte) (x / 256); // High byte of the 16-bit sample
            }
        }
        File out = new File("E:/RecordAudio7.wav"); // The path where the user wants the file to be written

        // Construct an audio format using a 44100 Hz sampling rate, 16-bit samples,
        // mono, and little-endian byte ordering
        AudioFormat format = new AudioFormat((float) SAMPLING_RATE, 16, 1, true, false);

        // Wraps byteBuffer[0] as the buffer of bytes that may be read from the stream
        ByteArrayInputStream bais = new ByteArrayInputStream(byteBuffer[0]);

        // Constructs an audio input stream with the requested format and length in sample frames,
        // using audio data from the specified input stream
        AudioInputStream audioInputStream = new AudioInputStream(bais, format, buffer.length);

        // Writes a stream of bytes representing an audio file of the specified file type to the given file
        AudioSystem.write(audioInputStream, AudioFileFormat.Type.WAVE, out);
        audioInputStream.close(); // Closes this audio input stream
    }
}
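To make the goal clearer, here is an untested sketch of the kind of result I am after: one flat byte[] sized totalSamples * 2 (so 882000 bytes for 10 seconds of audio), filled by switching between 'frequency1' and 'frequency2' every fixed block of samples to get the siren effect, which 'ByteArrayInputStream' can then accept directly. The block length blockSeconds, the hard-coded example values, and the class name SirenSketch are placeholders of mine, not part of the program above.

import java.io.ByteArrayInputStream;
import java.io.File;
import java.io.IOException;
import javax.sound.sampled.AudioFileFormat;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;

public class SirenSketch {

    public static void main(String[] args) throws IOException {
        final double SAMPLING_RATE = 44100; // Audio sampling rate
        int time = 10;             // Placeholder duration in seconds (would come from Scanner)
        int frequency1 = 440;      // Placeholder first frequency, in Hz
        int frequency2 = 554;      // Placeholder second frequency, in Hz
        double blockSeconds = 0.5; // Placeholder: how long each tone plays before switching

        int totalSamples = (int) (time * SAMPLING_RATE);
        int blockSamples = (int) (blockSeconds * SAMPLING_RATE);

        // One flat array, 2 bytes per 16-bit sample: 882000 bytes for 10 seconds
        byte[] byteBuffer = new byte[totalSamples * 2];

        for (int sample = 0; sample < totalSamples; sample++) {
            // Alternate between the two frequencies every blockSamples samples
            int frequency = ((sample / blockSamples) % 2 == 0) ? frequency1 : frequency2;
            double cycle = sample / SAMPLING_RATE;
            short value = (short) (Math.sin(2 * Math.PI * frequency * cycle) * Short.MAX_VALUE);

            // Little-endian 16-bit PCM: low byte first, then high byte
            byteBuffer[2 * sample] = (byte) (value & 0xFF);
            byteBuffer[2 * sample + 1] = (byte) ((value >> 8) & 0xFF);
        }

        // 44100 Hz, 16-bit, mono, signed, little-endian
        AudioFormat format = new AudioFormat((float) SAMPLING_RATE, 16, 1, true, false);
        ByteArrayInputStream bais = new ByteArrayInputStream(byteBuffer);
        // The length argument is in sample frames, not bytes
        AudioInputStream audioInputStream = new AudioInputStream(bais, format, totalSamples);
        AudioSystem.write(audioInputStream, AudioFileFormat.Type.WAVE, new File("E:/RecordAudio7.wav"));
        audioInputStream.close();
    }
}

I expect the abrupt switch between tones to click slightly at the block boundaries; ramping the amplitude or matching the phase at each switch would presumably smooth that out, but that is beyond what I have tried.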