Implementing audio recording in iOS

Posted by hiprakhar on Mon, 15 Jun 2020 08:37:16 +0200

Reference material

https://www.jianshu.com/p/fb7dfb033989

Audio file related knowledge

File formats
wav:
Features: the best sound quality; corresponds to PCM encoding
 Application: multimedia development, archiving music and audio material
mp3:
Features: good sound quality, high compression ratio, supported by a wide range of software and hardware
 Application: music listening where quality matters
caf:
Features: a container that supports almost every encoding format available on iOS
Encoding formats
PCM
PCM (Pulse Code Modulation) is an uncompressed digital audio technique: a faithful, uncompressed representation of the original sound. In the digital domain, audio always starts out as PCM.
MP3
AAC
AAC stands for Advanced Audio Coding; it was designed as the successor to the MP3 format.
HE-AAC
HE-AAC is a superset of AAC; the "HE" stands for "High Efficiency". It is an audio codec optimized for low bit rates.
AMR
AMR stands for Adaptive Multi-Rate; it is a codec optimized for speech and is also suited to low bit-rate environments.
ALAC
ALAC's full name is Apple Lossless; it is a lossless codec, i.e. compression without any loss of quality.
IMA4
IMA4 is a 16-bit format that compresses audio at a 4:1 ratio.
Factors affecting the size of audio files
Sampling frequency
The sampling frequency is the number of samples taken per unit of time. The higher the sampling frequency, the smaller the interval between samples and the more faithful the digitized sound, but the larger the amount of data.
Sample bit depth
The bit depth is the number of bits used to record each sample value, usually 8 or 16. The greater the bit depth, the finer the recorded variations in the sound, and the larger the amount of data.
Channels
The channel count indicates whether the sound is mono or stereo. Mono uses a single data stream, while stereo requires separate left and right streams. Stereo sounds better, but produces twice as much data as mono.
Duration
Calculation
 Data rate (bytes/second) = (sampling frequency (Hz) × bit depth (bits) × number of channels) / 8
 Total size = data rate × duration (seconds)
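
For example (a quick check of the formula, not a figure from the original post), CD-quality audio at 44,100 Hz, 16 bits, 2 channels gives:
 (44,100 × 16 × 2) / 8 = 176,400 bytes/second, i.e. roughly 10 MB per minute of uncompressed PCM.
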
Requesting microphone access
Apply for access by adding the microphone usage description to the Info.plist file:
 <key>NSMicrophoneUsageDescription</key>
 <string>The app requires your consent to access the microphone</string>
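
The system shows the permission prompt automatically the first time the microphone is used, but it can be cleaner to request access up front. A minimal sketch (my addition, not from the original post) using AVAudioSession's requestRecordPermission:, available since iOS 7; requestMicrophoneAccess is a hypothetical helper name:

- (void)requestMicrophoneAccess {
    // Ask the user for microphone access; the block receives their decision.
    [[AVAudioSession sharedInstance] requestRecordPermission:^(BOOL granted) {
        // The completion block may run on an arbitrary queue, so hop to main before touching UI.
        dispatch_async(dispatch_get_main_queue(), ^{
            if (granted) {
                NSLog(@"Microphone access granted");
            } else {
                NSLog(@"Microphone access denied; recording will not work");
            }
        });
    }];
}
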
Measured audio sizes
Recording for 1 minute:
caf format: 2.6 MB
 mp3 format: 227 KB

Recording for 10 minutes:
caf format: 26.5 MB
 mp3 format: 2.3 MB
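
These figures line up with the formula above: the code below records at 11,025 Hz, 16 bits, 2 channels, so (11,025 × 16 × 2) / 8 = 44,100 bytes/second of PCM, which is about 2.6 MB per minute and 26.5 MB per ten minutes, matching the measured caf sizes. The mp3 files are far smaller because MP3 is a lossy, compressed encoding.
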
caf to mp3

Refer to my blog
https://www.jianshu.com/p/62cac1ddb2a5

Code implementation

#import "ViewController.h"
#import <AVFoundation/AVFoundation.h>

@interface ViewController ()<AVAudioRecorderDelegate,AVAudioPlayerDelegate>{
    
    AVAudioRecorder *recorder;
    AVAudioPlayer *player;
    /** Recording timer */
    NSTimer *recordTimer;
    /** Play timer */
    NSTimer *playTimer;
    /** Recording time */
    NSInteger recordSecond;
    /** Recording minutes */
    NSInteger minuteRecord;
    /** Play time */
    NSInteger playSecond;
    /** Play minutes */
    NSInteger minutePlay;
    /** caf File path */
    NSURL *tmpUrl;
}

/** time */
@property (nonatomic, strong) UILabel *timeLbl;

@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    
    [self createUI];
}

#pragma mark - build interface
- (void)createUI{
    
    // Start button (createButtonWithTitle:sel:vc: is a custom UIButton category helper, not part of UIKit)
    UIButton *startBtn = [UIButton createButtonWithTitle:@"start" sel:@selector(startBtnEvent:) vc:self];
    startBtn.frame = CGRectMake(100, 100, 200, 50);
    [self.view addSubview:startBtn];

    // end
    UIButton *endBtn = [UIButton createButtonWithTitle:@"end" sel:@selector(endBtnEvent:) vc:self];
    endBtn.frame = CGRectMake(100, 150, 200, 50);
    [self.view addSubview:endBtn];

    // play
    UIButton *playBtn = [UIButton createButtonWithTitle:@"play" sel:@selector(playBtnEvent:) vc:self];
    playBtn.frame = CGRectMake(100, 200, 200, 50);
    [self.view addSubview:playBtn];
    
    // time
    self.timeLbl = [[UILabel alloc] initWithFrame:CGRectMake(100, 250, 200, 50)];
    self.timeLbl.textColor = [UIColor blackColor];
    self.timeLbl.textAlignment = NSTextAlignmentCenter;
    self.timeLbl.text = @"00:00";
    [self.view addSubview:self.timeLbl];
}


/**
 start
 */
- (void)startBtnEvent:(UIButton *)btn{

    // Start recording
    [self recordingAction];
}


/**
 end
 */
- (void)endBtnEvent:(UIButton *)btn{

    // Stop recording
    [self stopAction];
}


/**
 play
 */
- (void)playBtnEvent:(UIButton *)btn{

   // Play recording
   [self playAction];
}


/**
 Start recording
 */
- (void)recordingAction {
    
    NSLog(@"Start recording");
    AVAudioSession *audioSession = [AVAudioSession sharedInstance];
    [audioSession setCategory:AVAudioSessionCategoryRecord error:nil];
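    // Note (my addition): many examples also activate the session explicitly here,
    // e.g. [audioSession setActive:YES error:nil];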

    //Recording settings
    NSMutableDictionary *recordSettings = [[NSMutableDictionary alloc] init];
    //recording format 
    [recordSettings setValue :[NSNumber numberWithInt:kAudioFormatLinearPCM] forKey: AVFormatIDKey];
    //sampling rate
    [recordSettings setValue :[NSNumber numberWithFloat:11025.0] forKey: AVSampleRateKey];
    //Number of channels
    [recordSettings setValue :[NSNumber numberWithInt:2] forKey: AVNumberOfChannelsKey];
    //Linear PCM bit depth (bits per sample)
    [recordSettings setValue :[NSNumber numberWithInt:16] forKey: AVLinearPCMBitDepthKey];
    //Audio quality, sampling quality
    [recordSettings setValue:[NSNumber numberWithInt:AVAudioQualityMin] forKey:AVEncoderAudioQualityKey];

    NSError *error = nil;
    // Sandbox directory Documents address
    NSString *recordUrl = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) firstObject];
    // caf file path (use fileURLWithPath: to get a proper file URL)
    tmpUrl = [NSURL fileURLWithPath:[recordUrl stringByAppendingPathComponent:@"selfRecord.caf"]];
    recorder = [[AVAudioRecorder alloc]initWithURL:tmpUrl settings:recordSettings error:&error];
    
    if (recorder) {
        //Prepare the recorder, then start recording
        if ([recorder prepareToRecord] == YES) {
            [recorder record];

            recordSecond = 0;
            minuteRecord = 0;
            recordTimer = [NSTimer scheduledTimerWithTimeInterval:1 target:self selector:@selector(recordSecondChange) userInfo:nil repeats:YES];
            [recordTimer fire];
        }
        
    }else {
        NSLog(@"Recording creation failed");
    }
}


/**
 Recording timing
 */
- (void)recordSecondChange {
    
    recordSecond ++;
    if (recordSecond > 59) {
        
        minuteRecord ++;
        recordSecond = 0;
    }
    self.timeLbl.text = [NSString stringWithFormat:@"%.2ld:%.2ld",(long)minuteRecord,(long)recordSecond];
}


/**
 Stop recording
 */
- (void)stopAction {
    
    NSLog(@"Stop recording");
    //Stop recording
    [recorder stop];
    recorder = nil;
    [recordTimer invalidate];
    recordTimer = nil;
    
    self.timeLbl.text = [NSString stringWithFormat:@"%.2ld:%.2ld",(long)minuteRecord,(long)recordSecond];
}


/**
 Play recording
 */
- (void)playAction {
    AVAudioSession *audioSession = [AVAudioSession sharedInstance];
    [audioSession setCategory:AVAudioSessionCategoryPlayback error:nil];
    NSError *playError;
    
    player = [[AVAudioPlayer alloc] initWithContentsOfURL:tmpUrl error:&playError];
    //Print error message when playing recording is empty
    if (player == nil) {
        NSLog(@"Error crenting player: %@", [playError description]);
    }else {
        player.delegate = self;
        NSLog(@"Start playing");
        //Start playing
        playSecond = recordSecond;
        minutePlay = minuteRecord;
        if ([player prepareToPlay] == YES) {
           
            [player play];
            playTimer = [NSTimer scheduledTimerWithTimeInterval:1 target:self selector:@selector(playSecondChange) userInfo:nil repeats:YES];
            [playTimer fire];
        }
    }
}


/**
 Play timing
 */
- (void)playSecondChange {
    playSecond --;

    if (playSecond < 0) {
        
        if (minutePlay <= 0) {
            
            playSecond = 0;
            minutePlay = 0;
            [playTimer invalidate];
        }else{
            minutePlay --;
            playSecond = 59;
        }
        
    }
    self.timeLbl.text = [NSString stringWithFormat:@"%.2ld:%.2ld",(long)minutePlay,(long)playSecond];
}


//AVAudioPlayerDelegate callback, invoked when playback finishes.
- (void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player successfully:(BOOL)flag {
    NSLog(@"End of play");
    [playTimer invalidate];
    playTimer = nil;
    
    self.timeLbl.text = [NSString stringWithFormat:@"%.2ld:%.2ld",(long)minuteRecord,(long)recordSecond];
}
@end
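
One gap worth noting: the class declares AVAudioRecorderDelegate but never assigns the recorder's delegate, so the recorder's own completion callback is never received. A minimal sketch of wiring it up (my addition, not part of the original code; the method goes inside the @implementation):

// In recordingAction, right after creating the recorder:
//     recorder.delegate = self;

// AVAudioRecorderDelegate callback, invoked when recording stops or finishes.
- (void)audioRecorderDidFinishRecording:(AVAudioRecorder *)aRecorder successfully:(BOOL)flag {
    NSLog(@"Recording finished, success: %d", flag);
}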
