A VIP welcome system developed and implemented on ArcSoft face recognition

Posted by Zhadus on Mon, 17 Jan 2022 04:53:14 +0100

Application scenario:

Registered face information is recognized through the ArcSoft SDK, and the recognition results are sent to an Android client to notify business staff, so they get the customer's information the moment the customer arrives. It can be used as a welcome system, a supermarket VIP notification system, and so on.


The effect is as follows:

Face recognition welcome system

Features

  • Face recognition
  • Batch import of the face base library
  • Face recognition from video or camera
  • Real-time face notifications on the Android client
  • Voice announcement of face information on Android

Development tools

  • IDEA
  • Android Studio
  • Navicat for MySQL

Technical architecture

  • Java 8
  • Spring Boot
  • MySQL 5.7+
  • Android
  • ArcSoft Face Recognition SDK, value-added edition, Java 4.0

Operation instructions

  • 1. First create the arc_face_base database in MySQL (character set utf8mb4), then import the contents of doc/user_face_info.sql into it.
  • 2. Go to ArcSoft's official website http://ai.arcsoft.com.cn/ and apply for the free SDK download (the value-added edition supports a trial). Download the engine libraries you receive (libarcsoft_face, libarcsoft_face_engine and libarcsoft_face_engine_jni) to a folder of your choice for later use. Be careful to distinguish x86 from x64; it must match your current JDK.
  • 3. Download the server-side source code with git and change the configuration to your own. First modify the configuration file src\main\resources\application.properties:
    Library path of the face recognition engine: config.arcface-sdk.sdk-lib-path
    Storage path of the face base images: config.arcface-sdk.base-image-path
    Face recognition app ID: config.arcface-sdk.app-id
    Face recognition SDK key: config.arcface-sdk.sdk-key

Database connection: spring.datasource.druid.url
Database user name: spring.datasource.druid.username
Database password: spring.datasource.druid.password
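Putting those keys together, a minimal application.properties might look like the sketch below. All values are placeholders for illustration (paths, credentials, and JDBC options must be adjusted to your own environment):

```properties
# ArcSoft SDK configuration (placeholder values)
config.arcface-sdk.sdk-lib-path=F:/WIN64
config.arcface-sdk.base-image-path=F:/faces
config.arcface-sdk.app-id=your-app-id
config.arcface-sdk.sdk-key=your-sdk-key

# Druid data source, pointing at the database created in step 1
spring.datasource.druid.url=jdbc:mysql://127.0.0.1:3306/arc_face_base
spring.datasource.druid.username=root
spring.datasource.druid.password=your-password
```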

You can use com.itboyst.facedemo.FaceEngineTest to test whether the SDK runs normally:

package com.itboyst.facedemo;
import com.arcsoft.face.*;
import com.arcsoft.face.enums.DetectMode;
import com.arcsoft.face.enums.DetectModel;
import com.arcsoft.face.enums.DetectOrient;
import com.arcsoft.face.enums.ExtractType;
import com.arcsoft.face.toolkit.ImageFactory;
import com.arcsoft.face.toolkit.ImageInfo;
import com.arcsoft.face.toolkit.ImageInfoEx;

import java.io.File;
import java.util.ArrayList;
import java.util.List;
public class FaceEngineTest {
    public static void main(String[] args) {

        //Activation code, obtained from the official website
        String appId = "";
        String sdkKey = "";
        String activeKey = "";

        System.err.println("Note: if the returned errorCode is not 0, see the com.arcsoft.face.enums.ErrorInfo class for the corresponding error message");

        //Path where the face engine libraries are placed
        FaceEngine faceEngine = new FaceEngine("F:\\WIN64");
        //Activate engine
        int errorCode = faceEngine.activeOnline(appId, sdkKey, activeKey);
        System.out.println("Engine activation errorCode:" + errorCode);


        ActiveDeviceInfo activeDeviceInfo = new ActiveDeviceInfo();
        //Collect device information (for offline activation)
        errorCode = faceEngine.getActiveDeviceInfo(activeDeviceInfo);
        System.out.println("Collect device information errorCode:" + errorCode);
        System.out.println("Device information:" + activeDeviceInfo.getDeviceInfo());

        //faceEngine.activeOffline("d:\\ArcFacePro64.dat.offline");

        //ActiveFileInfo activeFileInfo = new ActiveFileInfo();
        //errorCode = faceEngine.getActiveFileInfo(activeFileInfo);
        //System.out.println("get the activation file errorCode:" + errorCode);
        //System.out.println("active file information:" + activeFileInfo.toString());

        //Engine configuration
        EngineConfiguration engineConfiguration = new EngineConfiguration();
        engineConfiguration.setDetectMode(DetectMode.ASF_DETECT_MODE_IMAGE);
        engineConfiguration.setDetectFaceOrientPriority(DetectOrient.ASF_OP_ALL_OUT);
        engineConfiguration.setDetectFaceMaxNum(10);
        //Function configuration
        FunctionConfiguration functionConfiguration = new FunctionConfiguration();
        functionConfiguration.setSupportAge(true);
        functionConfiguration.setSupportFace3dAngle(true);
        functionConfiguration.setSupportFaceDetect(true);
        functionConfiguration.setSupportFaceRecognition(true);
        functionConfiguration.setSupportGender(true);
        functionConfiguration.setSupportLiveness(true);
        functionConfiguration.setSupportIRLiveness(true);
        functionConfiguration.setSupportImageQuality(true);
        functionConfiguration.setSupportMaskDetect(true);
        functionConfiguration.setSupportFaceLandmark(true);
        functionConfiguration.setSupportUpdateFaceData(true);
        functionConfiguration.setSupportFaceShelter(true);
        engineConfiguration.setFunctionConfiguration(functionConfiguration);

        //Initialize engine
        errorCode = faceEngine.init(engineConfiguration);
        System.out.println("Initialize engine errorCode:" + errorCode);


        //Face detection
        ImageInfo imageInfo = ImageFactory.getRGBData(new File("xxx.jpg"));
        List<FaceInfo> faceInfoList = new ArrayList<FaceInfo>();
        errorCode = faceEngine.detectFaces(imageInfo, faceInfoList);
        System.out.println("Face detection errorCode:" + errorCode);
        System.out.println("Number of faces detected:" + faceInfoList.size());


        ImageQuality imageQuality = new ImageQuality();
        errorCode = faceEngine.imageQualityDetect(imageInfo, faceInfoList.get(0),0, imageQuality);
        System.out.println("Image quality detection errorCode:" + errorCode);
        System.out.println("Image quality score:" + imageQuality.getFaceQuality());

        //Feature extraction
        FaceFeature faceFeature = new FaceFeature();
        errorCode = faceEngine.extractFaceFeature(imageInfo, faceInfoList.get(0), ExtractType.REGISTER, 0, faceFeature);
        System.out.println("Feature extraction errorCode:" + errorCode);

        //Face detection 2
        ImageInfo imageInfo2 = ImageFactory.getRGBData(new File("xxx.jpg"));
        List<FaceInfo> faceInfoList2 = new ArrayList<FaceInfo>();
        errorCode = faceEngine.detectFaces(imageInfo2, faceInfoList2);
        System.out.println("Face detection errorCode:" + errorCode);
        System.out.println("Number of faces detected:" + faceInfoList2.size());

        //Feature extraction 2
        FaceFeature faceFeature2 = new FaceFeature();
        errorCode = faceEngine.extractFaceFeature(imageInfo2, faceInfoList2.get(0), ExtractType.REGISTER, 0, faceFeature2);
        System.out.println("Feature extraction 2 errorCode:" + errorCode);

        //Feature comparison
        FaceFeature targetFaceFeature = new FaceFeature();
        targetFaceFeature.setFeatureData(faceFeature.getFeatureData());
        FaceFeature sourceFaceFeature = new FaceFeature();
        sourceFaceFeature.setFeatureData(faceFeature2.getFeatureData());
        FaceSimilar faceSimilar = new FaceSimilar();

        errorCode = faceEngine.compareFaceFeature(targetFaceFeature, sourceFaceFeature, faceSimilar);
        System.out.println("Feature comparison errorCode:" + errorCode);
        System.out.println("Face similarity:" + faceSimilar.getScore());


        //Face attribute detection
        FunctionConfiguration configuration = new FunctionConfiguration();
        configuration.setSupportAge(true);
        configuration.setSupportFace3dAngle(true);
        configuration.setSupportGender(true);
        configuration.setSupportLiveness(true);
        configuration.setSupportMaskDetect(true);
        configuration.setSupportFaceLandmark(true);
        errorCode = faceEngine.process(imageInfo, faceInfoList, configuration);
        System.out.println("Image attribute processing errorCode:" + errorCode);

        //Gender detection
        List<GenderInfo> genderInfoList = new ArrayList<GenderInfo>();
        errorCode = faceEngine.getGender(genderInfoList);
        System.out.println("Gender:" + genderInfoList.get(0).getGender());

        //Age detection
        List<AgeInfo> ageInfoList = new ArrayList<AgeInfo>();
        errorCode = faceEngine.getAge(ageInfoList);
        System.out.println("Age:" + ageInfoList.get(0).getAge());

        //3D information detection
        List<Face3DAngle> face3DAngleList = new ArrayList<Face3DAngle>();
        errorCode = faceEngine.getFace3DAngle(face3DAngleList);
        System.out.println("3D Angle:" + face3DAngleList.get(0).getPitch() + "," + face3DAngleList.get(0).getRoll() + "," + face3DAngleList.get(0).getYaw());

        //Liveness detection
        List<LivenessInfo> livenessInfoList = new ArrayList<LivenessInfo>();
        errorCode = faceEngine.getLiveness(livenessInfoList);
        System.out.println("Liveness:" + livenessInfoList.get(0).getLiveness());


        //Landmark detection
        List<LandmarkInfo> landmarkInfoList = new ArrayList<LandmarkInfo>();
        errorCode = faceEngine.getLandmark(landmarkInfoList);
        System.out.println("Landmark: " + landmarkInfoList.get(0).getLandmarks()[0].getX());

        //Mask detection
        List<MaskInfo> maskInfoList = new ArrayList<MaskInfo>();
        errorCode = faceEngine.getMask(maskInfoList);
        System.out.println("Mask:" + maskInfoList.get(0).getMask());


        //IR attribute processing
        ImageInfo imageInfoGray = ImageFactory.getGrayData(new File("xxx.jpg"));
        List<FaceInfo> faceInfoListGray = new ArrayList<FaceInfo>();
        errorCode = faceEngine.detectFaces(imageInfoGray, faceInfoListGray);

        FunctionConfiguration configuration2 = new FunctionConfiguration();
        configuration2.setSupportIRLiveness(true);
        errorCode = faceEngine.processIr(imageInfoGray, faceInfoListGray, configuration2);
        //IR liveness detection
        List<IrLivenessInfo> irLivenessInfo = new ArrayList<IrLivenessInfo>();
        errorCode = faceEngine.getLivenessIr(irLivenessInfo);
        System.out.println("IR liveness:" + irLivenessInfo.get(0).getLiveness());


        //Get active file information
        ActiveFileInfo activeFileInfo2 = new ActiveFileInfo();
        errorCode = faceEngine.getActiveFileInfo(activeFileInfo2);

        //Update face data
        errorCode = faceEngine.updateFaceData(imageInfo, faceInfoList);

        //Advanced face image processing interface
        ImageInfoEx imageInfoEx = new ImageInfoEx();
        imageInfoEx.setHeight(imageInfo.getHeight());
        imageInfoEx.setWidth(imageInfo.getWidth());
        imageInfoEx.setImageFormat(imageInfo.getImageFormat());
        imageInfoEx.setImageDataPlanes(new byte[][]{imageInfo.getImageData()});
        imageInfoEx.setImageStrides(new int[]{imageInfo.getWidth() * 3});
        List<FaceInfo> faceInfoList1 = new ArrayList();
        errorCode = faceEngine.detectFaces(imageInfoEx, DetectModel.ASF_DETECT_MODEL_RGB, faceInfoList1);
        ImageQuality imageQuality1=new ImageQuality();
        errorCode = faceEngine.imageQualityDetect(imageInfoEx, faceInfoList1.get(0),0, imageQuality1);
        FunctionConfiguration fun = new FunctionConfiguration();
        fun.setSupportAge(true);
        errorCode = faceEngine.process(imageInfoEx, faceInfoList1, fun);
        List<AgeInfo> ageInfoList1 = new ArrayList();
        int age = faceEngine.getAge(ageInfoList1);
        FaceFeature feature = new FaceFeature();
        errorCode = faceEngine.extractFaceFeature(imageInfoEx, faceInfoList1.get(0), ExtractType.REGISTER,0,feature);
        errorCode = faceEngine.updateFaceData(imageInfoEx,faceInfoList1);

        //Set liveness thresholds
        errorCode = faceEngine.setLivenessParam(0.5f, 0.7f);
        System.out.println("Set liveness thresholds errorCode:" + errorCode);

        errorCode=faceEngine.setFaceShelterParam(0.8f);
        System.out.println("Set face occlusion threshold errorCode:" + errorCode);

        //Uninitialize the engine
        errorCode = faceEngine.unInit();

    }
}
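Nearly every SDK call above returns an int error code where 0 means success (non-zero codes map to entries in com.arcsoft.face.enums.ErrorInfo, as the test prints at startup). Rather than printing each code, a small fail-fast helper keeps the checks consistent. This is a sketch of my own, not part of the SDK:

```java
public class ErrorCheck {

    /**
     * Throws if an ArcSoft-style error code is non-zero (0 = success),
     * otherwise returns the code unchanged. The name "ensureOk" is mine,
     * not an SDK method.
     */
    public static int ensureOk(String operation, int errorCode) {
        if (errorCode != 0) {
            throw new IllegalStateException(operation + " failed, errorCode=" + errorCode
                    + " (see com.arcsoft.face.enums.ErrorInfo)");
        }
        return errorCode;
    }
}
```

Used like ensureOk("init", faceEngine.init(engineConfiguration)), it turns silent failures into immediate, readable exceptions.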

Batch import of the face base library

package com.itboyst.facedemo;

// Imports added for completeness; project-internal package paths follow the
// OpencvTest listing below (UserFaceInfo's package is assumed)
import cn.hutool.core.collection.CollectionUtil;
import cn.hutool.core.util.RandomUtil;
import com.arcsoft.face.FaceInfo;
import com.arcsoft.face.toolkit.ImageFactory;
import com.arcsoft.face.toolkit.ImageInfo;
import com.itboyst.facedemo.dto.FaceDetectResDTO;
import com.itboyst.facedemo.entity.UserFaceInfo;
import com.itboyst.facedemo.service.FaceEngineService;
import com.itboyst.facedemo.service.UserFaceInfoService;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.junit4.SpringRunner;

import javax.imageio.ImageIO;
import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

@RunWith(SpringRunner.class)
@SpringBootTest
public class FaceBatchAddTest {

    @Autowired
    UserFaceInfoService userFaceInfoService;
    @Autowired
    FaceEngineService faceEngineService;

    @Test
    public void faceBatchAdd() {
        int errorCode;
        int baseImgCount = 0;
        //Face feature acquisition
        Map<String, File> imgInfo = getImgInfo("Set the base-library image path here; photos should preferably be one-inch ID-style portraits, and each file is named after the person");
        for (Map.Entry<String, File> entry : imgInfo.entrySet()) {
            String name = entry.getKey();
            File file = entry.getValue();
            try {
                BufferedImage bufferedImage = ImageIO.read(file);
                //Load face database
                ImageInfo rgbData = ImageFactory.getRGBData(file);
                List<FaceDetectResDTO> faceDetectResDTOS = faceEngineService.detectFacesByAdd(bufferedImage);
                if (CollectionUtil.isNotEmpty(faceDetectResDTOS)) {
                    List<FaceInfo> faceInfoList = new ArrayList<>();
                    for (FaceDetectResDTO faceDetectResDTO : faceDetectResDTOS) {
                        FaceInfo faceInfo = new FaceInfo();
                        faceInfo.setRect(faceDetectResDTO.getRect());
                        faceInfo.setFaceId(faceDetectResDTO.getFaceId());
                        faceInfo.setOrient(faceDetectResDTO.getOrient());
                        faceInfo.setWearGlasses(faceDetectResDTO.getWearGlasses());
                        faceInfo.setLeftEyeClosed(faceDetectResDTO.getLeftEyeClosed());
                        faceInfo.setRightEyeClosed(faceDetectResDTO.getRightEyeClosed());
                        faceInfo.setFaceShelter(faceDetectResDTO.getFaceShelter());
                        faceInfo.setFaceData(faceDetectResDTO.getFaceData());
                        faceInfoList.add(faceInfo);
                    }
                    byte[] feature = faceEngineService.extractFaceFeature(rgbData, faceInfoList.get(0));
                    UserFaceInfo userFaceInfo = new UserFaceInfo();
                    userFaceInfo.setName(name);
                    userFaceInfo.setGroupId(101);
                    userFaceInfo.setFaceFeature(feature);
                    userFaceInfo.setBaseImgPath(faceDetectResDTOS.get(0).getBaseImgPath());
                    userFaceInfo.setFaceId(RandomUtil.randomInt(1000000));

                    //Insert face features into database
                    userFaceInfoService.insertSelective(userFaceInfo);
                }
                baseImgCount++;
                System.out.println("baseFeats:" + baseImgCount + "/" + imgInfo.size());
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }
}
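The getImgInfo helper used above is not shown in the listing; presumably it scans the base-library directory and maps each person's name (the file name without its extension) to the image file. A minimal sketch of what such a helper might look like (my reconstruction, not the project's actual code):

```java
import java.io.File;
import java.util.LinkedHashMap;
import java.util.Map;

public class ImgInfoUtil {

    /**
     * Maps "Zhang San" -> File("base/Zhang San.jpg") for every file in the
     * directory, using the file name without its extension as the key.
     */
    public static Map<String, File> getImgInfo(String dirPath) {
        Map<String, File> result = new LinkedHashMap<>();
        File[] files = new File(dirPath).listFiles();
        if (files == null) {
            return result; // not a directory, or not readable
        }
        for (File file : files) {
            String name = file.getName();
            int dot = name.lastIndexOf('.');
            // skip subdirectories and files without an extension
            if (file.isFile() && dot > 0) {
                result.put(name.substring(0, dot), file);
            }
        }
        return result;
    }
}
```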

The base library looks like this:

After configuration, execute the faceBatchAdd method. Note: if this error is reported:

java.awt.HeadlessException
    at java.awt.GraphicsEnvironment.checkHeadless(GraphicsEnvironment.java:204)
    at java.awt.Window.<init>(Window.java:536)
    at java.awt.Frame.<init>(Frame.java:420)
    at java.awt.Frame.<init>(Frame.java:385)
    at javax.swing.SwingUtilities$SharedOwnerFrame.<init>(SwingUtilities.java:1763)
    at javax.swing.SwingUtilities.getSharedOwnerFrame(SwingUtilities.java:1838)
    at javax.swing.JDialog.<init>(JDialog.java:272)
    at javax.swing.JDialog.<init>(JDialog.java:206)
    at javax.swing.JDialog.<init>(JDialog.java:154)
    at com.itboyst.facedemo.util.ImageGUI.createWin(ImageGUI.java:40)
    at com.itboyst.facedemo.OpencvTest.testOpencv(OpencvTest.java:65)

the solution is to add -Djava.awt.headless=false to the JVM options of the run configuration.
Then check whether the face information has been stored in the database.

After the import succeeds, start the project with com.itboyst.facedemo.Application. Once it is running, open http://127.0.0.1:8099/ in a browser to access it.


Now upload a face photo of someone who exists in the base library and it will be recognized.
You can also register a single face into the base library at the position shown in the figure below, for later use.

Android Client Deployment

Download the Android source code with git, open the project in Android Studio, then compile and deploy it to a phone.

The user name and password here are both 1; click login. If you reach the next screen normally, you will receive face recognition results. If login fails, check whether the configured IP is actually the server's IP, whether the Spring Boot project on the server started normally, and whether the server firewall has opened port 8888.
After a successful login, you can upload photos at the position shown in the figure below; the Android client then receives the face recognition information and announces it by voice.
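The article does not show the wire protocol the server uses to push recognition results to Android over port 8888. Assuming a simple newline-delimited text push (an assumption for illustration only; the real project's protocol may differ), the receiving side could be sketched like this:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

public class FacePushClient {

    /**
     * Connects to the server and returns the first pushed message.
     * Assumes a newline-delimited UTF-8 text protocol; this is an
     * illustrative sketch, not the project's actual client code.
     */
    public static String readOneMessage(String host, int port) throws IOException {
        try (Socket socket = new Socket(host, port);
             BufferedReader reader = new BufferedReader(
                     new InputStreamReader(socket.getInputStream(), StandardCharsets.UTF_8))) {
            return reader.readLine();
        }
    }
}
```

In a real client this read would run on a background thread in a loop, handing each message to the UI and the voice announcer.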

The effect is as follows:

Face recognition welcome system

Video or camera face recognition

The source code is as follows: com.itboyst.facedemo.OpencvTest

package com.itboyst.facedemo;

import cn.hutool.core.collection.CollectionUtil;
import com.arcsoft.face.FaceInfo;
import com.arcsoft.face.Rect;
import com.arcsoft.face.toolkit.ImageInfo;
import com.google.common.collect.Lists;
import com.itboyst.facedemo.dto.FaceDetectResDTO;
import com.itboyst.facedemo.dto.ProcessInfo;
import com.itboyst.facedemo.dto.UserCompareInfo;
import com.itboyst.facedemo.service.FaceEngineService;
import com.itboyst.facedemo.service.UserFaceInfoService;
import com.itboyst.facedemo.util.ImageGUI;
import com.itboyst.facedemo.util.ImageUtil;
import com.itboyst.facedemo.util.ShowVideo;
import com.itboyst.facedemo.util.UserRamCache;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.opencv.core.Core;
import org.opencv.core.Mat;
import org.opencv.videoio.VideoCapture;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.junit4.SpringRunner;

import java.awt.*;
import java.awt.image.BufferedImage;
import java.util.List;

import static com.arcsoft.face.toolkit.ImageFactory.bufferedImage2ImageInfo;

@RunWith(SpringRunner.class)
@SpringBootTest
public class OpencvTest {

    @Autowired
    UserFaceInfoService userFaceInfoService;
    @Autowired
    FaceEngineService faceEngineService;

    @Test
    public void testOpencv() {
        System.load("C:\\lib\\opencv_java320.dll");
        System.load("C:\\lib\\opencv_ffmpeg320_64.dll");
        System.load("C:\\lib\\opencv_world320.dll");

        // Open a camera or a video file
        // Device 0 opens the laptop's built-in camera by default. If device 0 cannot
        // open an external USB camera, change the device index to 1 or 2 -- the id
        // the system assigned to the USB camera when it was plugged in
        VideoCapture capture = new VideoCapture();
        //capture.open(0);
        capture.open("C:\\vip1.mp4");
        if (!capture.isOpened()) {
            System.out.println("could not load video data...");
            return;
        }
        int frameWidth = (int) capture.get(3);
        //int frameWidth = 720;
        int frameHeight = (int) capture.get(4);
        //int frameHeight = 480;
        ImageGUI gui = new ImageGUI();
        gui.createWin("camera", new Dimension(frameWidth, frameHeight));
        Mat frame = new Mat();
        List<FaceDetectResDTO> faceDetectResDTOS = Lists.newLinkedList();
        int index = 0;
        while (true) {
            boolean have = capture.read(frame);
            // Mirror the frame horizontally (useful for a front-facing camera)
            Core.flip(frame, frame, 1);
            if (!have) {
                break;
            }
            if (!frame.empty()) {
                BufferedImage bufferedImage = ImageUtil.mat2BufImg(frame, ".jpg");
                if (bufferedImage != null) {
                    //Convert video to pictures
                    long start = System.currentTimeMillis();
                    ImageInfo videoImageInfo = bufferedImage2ImageInfo(bufferedImage);
                    //Feature video image extraction
                    List<FaceInfo> videoFaceInfoList = faceEngineService.detectFaces(videoImageInfo);
                    List<ProcessInfo> process = faceEngineService.process(videoImageInfo, videoFaceInfoList);
                    if (videoFaceInfoList!=null && videoFaceInfoList.size() > 0) {
                        index++;
                        if (index >= 5) {
                            //Feature comparison
                            faceDetectResDTOS.clear();
                            for (int i = 0; i < videoFaceInfoList.size(); i++) {
                                FaceDetectResDTO faceDetectResDTO = new FaceDetectResDTO();
                                FaceInfo faceInfo = videoFaceInfoList.get(i);
                                faceDetectResDTO.setRect(faceInfo.getRect());
                                faceDetectResDTO.setFaceId(faceInfo.getFaceId());
                                faceDetectResDTO.setOrient(faceInfo.getOrient());
                                faceDetectResDTO.setWearGlasses(faceInfo.getWearGlasses());
                                faceDetectResDTO.setLeftEyeClosed(faceInfo.getLeftEyeClosed());
                                faceDetectResDTO.setRightEyeClosed(faceInfo.getRightEyeClosed());
                                faceDetectResDTO.setFaceShelter(faceInfo.getFaceShelter());
                                faceDetectResDTO.setFaceData(faceInfo.getFaceData());
                                if (CollectionUtil.isNotEmpty(process)) {
                                    ProcessInfo processInfo = process.get(i);
                                    faceDetectResDTO.setAge(processInfo.getAge());
                                    faceDetectResDTO.setGender(processInfo.getGender());
                                    faceDetectResDTO.setLiveness(processInfo.getLiveness());
                                }
                                byte[] feature = faceEngineService.extractFaceFeature(videoImageInfo, faceInfo);

                                if (feature != null) {
                                    List<UserCompareInfo> userCompareInfos = faceEngineService.faceRecognition(feature, UserRamCache.getUserList(), 0.8f);
                                    if (CollectionUtil.isNotEmpty(userCompareInfos)) {
                                        faceDetectResDTO.setName(userCompareInfos.get(0).getName());
                                        faceDetectResDTO.setSimilar(userCompareInfos.get(0).getSimilar());
                                        faceDetectResDTO.setFaceId(userCompareInfos.get(0).getFaceId());
                                    }
                                }
                                long end = System.currentTimeMillis();
                                System.out.println("Detection time:" + (end - start) + "ms");
                                faceDetectResDTOS.add(faceDetectResDTO);


                            }
                            //System.out.println(index);
                            index = 0;
                        }
                        //Mark face information
                        for (FaceDetectResDTO faceDetectResDTO : faceDetectResDTOS) {
                            int faceId = faceDetectResDTO.getFaceId();
                            String name = faceDetectResDTO.getName();
                            float score = faceDetectResDTO.getSimilar();
                            Rect rect = faceDetectResDTO.getRect();
                            int age = faceDetectResDTO.getAge();
                            int gender = faceDetectResDTO.getGender();
                            int liveness = faceDetectResDTO.getLiveness();
                            String genderStr = gender == 0 ? "male" : "female";
                            String livenessStr = (liveness == 1 ? "living" : "non-living");
                            //Imgproc.putText(frame, name + " " + String.valueOf(score), new Point(rect.left, rect.top), 0, 1.0, new Scalar(0, 255, 0), 1, Imgproc.LINE_AA, false);
                            frame = ImageUtil.addMark(bufferedImage, videoFaceInfoList, faceId, name, score, rect, age, genderStr, livenessStr);
                        }
                    }
                    gui.imshow(ShowVideo.conver2Image(frame));
                    gui.repaint();
                }
                try {
                    Thread.sleep(100);
                } catch (InterruptedException e) {
                    e.printStackTrace();
                }
            }
        }
    }
}
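Note that the loop above deliberately runs the expensive extract-and-compare step only once every 5 face-bearing frames (the index counter), reusing the last results to draw the overlay in between. That throttling pattern, isolated as a small helper (class and method names are mine, for illustration):

```java
public class FrameThrottle {

    private final int every;
    private int count = 0;

    public FrameThrottle(int every) {
        this.every = every;
    }

    /**
     * Counts frames that contain a face and returns true on every N-th one,
     * then resets, mirroring the "index >= 5" logic in the recognition loop.
     */
    public boolean shouldProcess(boolean faceDetected) {
        if (!faceDetected) {
            return false;
        }
        count++;
        if (count >= every) {
            count = 0;
            return true;
        }
        return false;
    }
}
```

Raising the interval lowers CPU load at the cost of slower-updating labels; lowering it does the opposite.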

Pay attention to modifying the jar package and DLL paths in the source code above.
Run com.itboyst.facedemo.OpencvTest to recognize faces from a video file, or uncomment capture.open(0) (and comment out the file path) for real-time recognition from the camera. That is all the functionality of the source code; further needs are left for readers to explore on their own, and this is offered only as a starting point.
Source address: https://gitee.com/x55admin/ArcSoftFaceDemo
