Application Note
Developing Multiple Camera Applications on AM6x
Jianzhong Xu, Qutaiba Saleh
ABSTRACT
This report describes application development using multiple CSI-2 cameras on the AM6x family of devices. A reference design of object detection with deep learning on 4 cameras on the AM62A SoC is presented with performance analysis. General principles of the design apply to other SoCs with a CSI-2 interface, such as AM62x and AM62P.
Introduction
Embedded cameras play an essential role in modern vision systems. Using multiple cameras in a system expands the capabilities of these systems and enables functions that are not achievable with a single camera. Below are some examples of applications using multiple embedded cameras:
- Security Surveillance: Multiple cameras deployed in a system provide comprehensive surveillance coverage. They enable panoramic views, reduce blind spots, and enhance the accuracy of object tracking and recognition, improving overall security measures.
- Stereo Vision: Multiple cameras are used to create a stereo vision setup, enabling three-dimensional information and the estimation of depth. This is crucial for tasks such as obstacle detection in autonomous vehicles, precise object manipulation in robotics, and enhanced realism of augmented reality experiences.
- Cabin Recorder and Camera Mirror System: A car cabin recorder with multiple cameras can provide more coverage using a single processor. Similarly, a camera mirror system with two or more cameras can expand the driver's field of view and eliminate blind spots on all sides of the vehicle.
- Medical Imaging: Multiple cameras can be used in medical imaging for tasks like surgical navigation, providing surgeons with multiple perspectives for enhanced precision. In endoscopy, multiple cameras enable a thorough examination of internal organs.
- Drones and Aerial Imaging: Drones often come equipped with multiple cameras to capture high-resolution images or videos from different angles. This is useful in applications like aerial photography, agriculture monitoring, and land surveying.
With the advancement of microprocessors, multiple cameras can be integrated into a single System-on-Chip (SoC) to provide compact and efficient solutions. The AM62Ax SoC, with high-performance video/vision processing and deep learning acceleration, is an ideal device for the above-mentioned use cases. Another AM6x device, the AM62P, is built for high-performance embedded 3D display applications. Equipped with 3D graphics acceleration, the AM62P can easily stitch together the images from multiple cameras and produce a high-resolution panoramic view. The features of the AM62A/AM62P SoCs have been documented in various publications, such as [4], [5], and [6]. This application note does not repeat those feature descriptions; instead, it focuses on integrating multiple CSI-2 cameras into embedded vision applications on AM62A/AM62P. Table 1-1 shows the differences between AM62A and AM62P in terms of image processing.
Table 1-1. Differences Between AM62A and AM62P in Image Processing
SoC | AM62A | AM62P |
Supported camera type | With or without a built-in ISP | With a built-in ISP |
Camera output data | Raw/YUV/RGB | YUV/RGB |
ISP HWA | Yes | No |
Deep learning HWA | Yes | No |
3D graphics HWA | No | Yes |
Connecting Multiple CSI-2 Cameras to the SoC
The camera subsystem on the AM6x SoC has the following components, as shown in Figure 2-1:
- MIPI D-PHY Receiver: receives video streams from external cameras, supporting up to 1.5 Gbps per data lane for 4 lanes.
- CSI-2 Receiver (RX): receives video streams from the D-PHY receiver and either directly sends the streams to the ISP or dumps the data to DDR memory. This module supports up to 16 virtual channels.
- SHIM: a DMA wrapper that enables sending the captured streams to memory over DMA. Multiple DMA contexts can be created by this wrapper, with each context corresponding to a virtual channel of the CSI-2 Receiver.
Multiple cameras can be supported on the AM6x through the use of virtual channels of CSI-2 RX, even though there is only one CSI-2 RX interface on the SoC. An external CSI-2 aggregating component is needed to combine multiple camera streams and send them to a single SoC. Two types of CSI-2 aggregating solutions can be used, described in the following sections.
CSI-2 Aggregator Using SerDes
One way to aggregate multiple camera streams is to use a serializing/deserializing (SerDes) solution. The CSI-2 data from each camera is converted by a serializer and transmitted over a cable. A deserializer receives the serialized data from all the cables (one cable per camera), converts the streams back to CSI-2 data, and sends the aggregated CSI-2 streams to the single CSI-2 RX interface on the SoC. Each camera stream is identified by a unique virtual channel. This aggregation solution provides the additional benefit of allowing long-distance connections, up to 15 m from the camera to the SoC.
The FPD-Link and V3-Link serializers and deserializers (SerDes), supported in the AM6x Linux SDK, are the most popular technologies for this type of CSI-2 aggregation. Both FPD-Link and V3-Link deserializers have a back channel that can be used to send synchronization signals to synchronize all the cameras, as described in [7].
Figure 2-2 shows an example of using SerDes to connect multiple cameras to a single AM6x SoC.
An example of this aggregating solution can be found in the Arducam V3Link Camera Solution Kit. This kit has a deserializer hub which aggregates 4 CSI-2 camera streams, as well as 4 pairs of V3Link serializers and IMX219 cameras, including FAKRA coaxial cables and 22-pin FPC cables. The reference design discussed later is built on this kit.
CSI-2 Aggregator without Using SerDes
This type of aggregator can connect multiple MIPI CSI-2 cameras directly and combine the data from all the cameras into a single CSI-2 output stream.
Figure 2-3 shows an example of such a system. This type of aggregating solution does not use any serializer/deserializer but is limited by the maximum distance of CSI-2 data transfer, which is about 30 cm. The AM6x Linux SDK does not support this type of CSI-2 aggregator.
Supporting Multiple Cameras in Software
Camera Subsystem Software Architecture
Figure 3-1 shows the high-level block diagram of the camera subsystem software in the AM62A/AM62P Linux SDK, corresponding to the hardware system shown in Figure 2-2.
This software architecture enables the SoC to receive multiple camera streams through SerDes, as shown in Figure 2-2. The FPD-Link/V3-Link SerDes assigns a unique I2C address and a unique virtual channel to each camera. A separate device tree overlay must be created for each camera, using its unique I2C address. The CSI-2 RX driver identifies each camera by its unique virtual channel number and creates a DMA context for each camera stream. A video node is created for each DMA context. The data from each camera is received over DMA and stored in its own memory buffer. A user-space application uses the video node corresponding to each camera to access that camera's data. Examples of using this software architecture are given in Chapter 4, Reference Design.
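As a rough illustration of how these overlays are enabled, the sketch below shows the typical SDK mechanism of listing camera/SerDes overlays in uEnv.txt on the boot partition; the overlay file name and the boot partition mount point are placeholders that depend on the SDK version and board.

```
# Sketch only: enable a camera/SerDes device tree overlay at boot.
# The overlay name and the boot partition mount point are placeholders;
# use the .dtbo files shipped with your SDK (see the dtb/ti/ directory
# of the boot partition) and your actual mount point.
echo "name_overlays=ti/k3-am62x-sk-csi2-v3link-fusion.dtbo" \
  >> /run/media/BOOT-mmcblk1p1/uEnv.txt
sync
reboot
```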
Any driver that conforms to the V4L2 framework can be plugged in and invoked in this architecture. Refer to [8] for how to integrate a new camera driver into the Linux SDK.
Image Pipeline Software Architecture
The AM6x Linux SDK provides the GStreamer (GST) framework, which can be used in user space to integrate the image processing components for various applications. The hardware accelerators (HWA) on the SoC, such as the Vision Pre-processing Accelerator (VPAC, or ISP), the video encoder/decoder, and the deep learning compute engine, are accessed through GST plugins. The VPAC (ISP) itself has multiple blocks, including the Vision Imaging Sub-System (VISS), Lens Distortion Correction (LDC), and the Multiscaler (MSC), each corresponding to a GST plugin.
Figure 3-2 shows the block diagram of a typical image pipeline from the camera to encoding or deep learning applications on AM62A. For more details about the end-to-end data flow, refer to the Edge AI SDK documentation.
For AM62P, the image pipeline is simpler because there is no ISP on AM62P.
With a video node created for each camera, the GStreamer-based image pipeline allows the different camera inputs (aggregated through the single CSI-2 RX interface) to be processed simultaneously. A reference design using GStreamer for multi-camera applications is given in the next chapter.
Reference Design
This chapter presents a reference design that runs multi-camera applications on the AM62A EVM, using the Arducam V3Link Camera Solution Kit to connect four CSI-2 cameras to AM62A and running object detection on all four cameras.
Supported Cameras
The Arducam V3Link kit works with both FPD-Link/V3-Link-based cameras and Raspberry Pi-compatible CSI-2 cameras. The following cameras have been tested:
- D3 Engineering D3RCM-IMX390-953
- Leopard Imaging LI-OV2312-FPDLINKIII-110H
- IMX219 cameras in the Arducam V3Link Camera Solution Kit
Setting up Four IMX219 Cameras
Follow the instructions provided in the AM62A Starter Kit EVM Quick Start Guide to set up the SK-AM62A-LP EVM (AM62A SK) and ArduCam V3Link Camera Solution Quick Start Guide to connect the cameras to AM62A SK through the V3Link kit. Make sure the pins on the flex cables, cameras, V3Link board, and AM62A SK are all aligned properly.
Figure 4-1 shows the setup used for the reference design in this report. The main components in the setup include:
- 1X SK-AM62A-LP EVM board
- 1X Arducam V3Link d-ch adapter board
- FPC cable connecting Arducam V3Link to SK-AM62A
- 4X V3Link camera adapters (serializers)
- 4X RF coaxial cables to connect V3Link serializers to V3Link d-ch kit
- 4X IMX219 Cameras
- 4X CSI-2 22-pin cables to connect cameras to serializers
- Cables: HDMI cable, USB-C power supply for the SK-AM62A-LP, and 12 V power supply for the V3Link d-ch kit
- Other components not shown in Figure 4-1: micro-SD card, micro-USB cable to access SK-AM62A-LP, and Ethernet for streaming
Configuring Cameras and CSI-2 RX Interface
Set up the software according to the instructions provided in the Arducam V3Link Quick Start Guide. After running the camera setup script, setup-imx219.sh, the camera format, the CSI-2 RX interface format, and the routes from each camera to the corresponding video node are configured properly. Four video nodes are created for the four IMX219 cameras. The command "v4l2-ctl --list-devices" displays all the V4L2 video devices, as shown below:
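The commands below illustrate this step; the device numbering in the listing is only representative and can differ on your setup.

```
# Configure the four IMX219 cameras and the CSI-2 RX interface
# (setup script provided with the Arducam V3Link kit).
./setup-imx219.sh

# List all V4L2 video devices. The ticsi2rx entries below are
# representative; actual /dev/videoN numbers can vary.
v4l2-ctl --list-devices
# tiscsi2rx (platform:30102000.ticsi2rx):
#         /dev/video3
#         /dev/video4
#         /dev/video5
#         /dev/video6
#         /dev/video7
#         /dev/video8
#         /dev/media0
```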
There are 6 video nodes and 1 media node under tiscsi2rx. Each video node corresponds to a DMA context allocated by the CSI2 RX driver. Out of the 6 video nodes, 4 are used for the 4 IMX219 cameras, as shown in the media pipe topology below:
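The topology can be inspected with media-ctl; the excerpt below is trimmed and only illustrative (entity and pad numbers depend on the configuration).

```
# Print the media controller topology of the CSI-2 RX media device.
media-ctl -d /dev/media0 -p
# Illustrative excerpt for one of the four camera routes:
# - entity N: 30102000.ticsi2rx (source pads feeding the DMA contexts)
#       pad1: Source
#               -> "30102000.ticsi2rx context 0":0 [ENABLED]
```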
As shown above, the media entity 30102000.ticsi2rx has 6 source pads, but only the first 4 are used, one for each IMX219. The media pipeline topology can also be rendered as a graph. Run the following command to create a dot file:
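A possible invocation is sketched below; the output file name is arbitrary.

```
# On the AM62A target: dump the media graph in Graphviz dot format.
media-ctl -d /dev/media0 --print-dot > csi2rx-topology.dot
```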
Then run the command below on a Linux host PC to generate a PNG file:
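For example, with Graphviz installed on the host PC:

```
# Convert the dot file (copied from the target) into a PNG image.
dot -Tpng csi2rx-topology.dot -o csi2rx-topology.png
```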
Figure 4-2 is the diagram generated with the commands given above. The components of the software architecture shown in Figure 3-1 can be seen in this diagram.
Streaming from Four Cameras
With both the hardware and the software set up properly, camera applications can be run from user space. For AM62A, the ISP must be tuned to produce good image quality; refer to the AM6xA ISP Tuning Guide for how to perform ISP tuning. The following sections give examples of streaming camera data to a display, streaming camera data over the network, and saving camera data to files.
Streaming Camera Data to Display
The main application of this multi-camera system is to stream video from all the cameras to a display connected to the same SoC. The following is an example GStreamer pipeline that streams four IMX219 cameras to a display (the video node numbers and v4l2-subdev numbers in the pipeline can change from boot to boot):
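The sketch below is patterned after the pipelines in the SDK documentation rather than taken from it verbatim: the device nodes, v4l2 subdevice nodes, DCC file paths, bayer caps, and mosaic window positions are placeholders to adjust for your SDK version and boot enumeration.

```
# Sketch: stream four IMX219 cameras to a 1080p display via tiovxisp,
# tiovxmultiscaler, and tiovxmosaic. All /dev/video*, /dev/v4l-subdev*,
# and /opt/imaging/imx219/* paths are placeholders; the bayer caps and
# format-msb assume the 8-bit RAW mode set by the setup script.
gst-launch-1.0 \
  tiovxmosaic name=mosaic \
    sink_0::startx="<0>"   sink_0::starty="<0>" \
    sink_1::startx="<960>" sink_1::starty="<0>" \
    sink_2::startx="<0>"   sink_2::starty="<540>" \
    sink_3::startx="<960>" sink_3::starty="<540>" \
  ! video/x-raw,width=1920,height=1080 ! kmssink driver-name=tidss sync=false \
  v4l2src device=/dev/video3 io-mode=dmabuf-import \
  ! video/x-bayer,width=1920,height=1080,format=rggb \
  ! tiovxisp sensor-name=SENSOR_SONY_IMX219_RPI \
      dcc-isp-file=/opt/imaging/imx219/linear/dcc_viss.bin \
      sink_0::dcc-2a-file=/opt/imaging/imx219/linear/dcc_2a.bin \
      sink_0::device=/dev/v4l-subdev2 format-msb=7 \
  ! video/x-raw,format=NV12 \
  ! tiovxmultiscaler ! video/x-raw,width=960,height=540 ! queue ! mosaic.sink_0 \
  v4l2src device=/dev/video4 io-mode=dmabuf-import \
  ! video/x-bayer,width=1920,height=1080,format=rggb \
  ! tiovxisp sensor-name=SENSOR_SONY_IMX219_RPI \
      dcc-isp-file=/opt/imaging/imx219/linear/dcc_viss.bin \
      sink_0::dcc-2a-file=/opt/imaging/imx219/linear/dcc_2a.bin \
      sink_0::device=/dev/v4l-subdev3 format-msb=7 \
  ! video/x-raw,format=NV12 \
  ! tiovxmultiscaler ! video/x-raw,width=960,height=540 ! queue ! mosaic.sink_1 \
  v4l2src device=/dev/video5 io-mode=dmabuf-import \
  ! video/x-bayer,width=1920,height=1080,format=rggb \
  ! tiovxisp sensor-name=SENSOR_SONY_IMX219_RPI \
      dcc-isp-file=/opt/imaging/imx219/linear/dcc_viss.bin \
      sink_0::dcc-2a-file=/opt/imaging/imx219/linear/dcc_2a.bin \
      sink_0::device=/dev/v4l-subdev4 format-msb=7 \
  ! video/x-raw,format=NV12 \
  ! tiovxmultiscaler ! video/x-raw,width=960,height=540 ! queue ! mosaic.sink_2 \
  v4l2src device=/dev/video6 io-mode=dmabuf-import \
  ! video/x-bayer,width=1920,height=1080,format=rggb \
  ! tiovxisp sensor-name=SENSOR_SONY_IMX219_RPI \
      dcc-isp-file=/opt/imaging/imx219/linear/dcc_viss.bin \
      sink_0::dcc-2a-file=/opt/imaging/imx219/linear/dcc_2a.bin \
      sink_0::device=/dev/v4l-subdev5 format-msb=7 \
  ! video/x-raw,format=NV12 \
  ! tiovxmultiscaler ! video/x-raw,width=960,height=540 ! queue ! mosaic.sink_3
```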
Streaming Camera Data through Ethernet
Instead of streaming to a display connected to the same SoC, the camera data can also be streamed over Ethernet. The receiving end can be another AM62A/AM62P processor or a host PC. The following is an example of streaming camera data over Ethernet (using two cameras for simplicity); note the codec plugin used in the pipeline:
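A possible sketch is shown below, assuming the hardware H.264 encoder is exposed as v4l2h264enc as in the SDK documentation; the destination IP address, UDP ports, device nodes, and DCC paths are placeholders.

```
# Sketch: encode two cameras with the hardware H.264 encoder and send
# them as RTP/UDP streams. HOST, ports, /dev/video*, /dev/v4l-subdev*,
# and DCC paths are placeholders.
HOST=192.168.1.100
gst-launch-1.0 \
  v4l2src device=/dev/video3 io-mode=dmabuf-import \
  ! video/x-bayer,width=1920,height=1080,format=rggb \
  ! tiovxisp sensor-name=SENSOR_SONY_IMX219_RPI \
      dcc-isp-file=/opt/imaging/imx219/linear/dcc_viss.bin \
      sink_0::dcc-2a-file=/opt/imaging/imx219/linear/dcc_2a.bin \
      sink_0::device=/dev/v4l-subdev2 format-msb=7 \
  ! video/x-raw,format=NV12 \
  ! v4l2h264enc ! h264parse ! rtph264pay ! udpsink host=$HOST port=5000 \
  v4l2src device=/dev/video4 io-mode=dmabuf-import \
  ! video/x-bayer,width=1920,height=1080,format=rggb \
  ! tiovxisp sensor-name=SENSOR_SONY_IMX219_RPI \
      dcc-isp-file=/opt/imaging/imx219/linear/dcc_viss.bin \
      sink_0::dcc-2a-file=/opt/imaging/imx219/linear/dcc_2a.bin \
      sink_0::device=/dev/v4l-subdev3 format-msb=7 \
  ! video/x-raw,format=NV12 \
  ! v4l2h264enc ! h264parse ! rtph264pay ! udpsink host=$HOST port=5001
```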
The following is an example of receiving the camera data and streaming it to a display on another AM62A/AM62P processor:
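A minimal receive-side sketch for one stream is shown below (repeat with the other ports for the remaining cameras); the hardware decoder element name and the display sink are assumptions based on the SDK documentation.

```
# Sketch: receive one RTP/H.264 stream and show it on the display of the
# receiving AM62A/AM62P. Port number and payload type are placeholders.
gst-launch-1.0 \
  udpsrc port=5000 \
    caps="application/x-rtp,media=video,encoding-name=H264,payload=96" \
  ! rtph264depay ! h264parse ! v4l2h264dec \
  ! kmssink driver-name=tidss sync=false
```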
Storing Camera Data to Files
Instead of streaming to a display or through a network, the camera data can be stored in local files. The pipeline below stores the data from each camera to a separate file (using two cameras as an example for simplicity):
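A possible sketch follows, again assuming the v4l2h264enc hardware encoder; file names, device nodes, and DCC paths are placeholders.

```
# Sketch: record two cameras to local MP4 files. The -e flag makes
# gst-launch send EOS on Ctrl-C so that the MP4 files are finalized.
gst-launch-1.0 -e \
  v4l2src device=/dev/video3 io-mode=dmabuf-import \
  ! video/x-bayer,width=1920,height=1080,format=rggb \
  ! tiovxisp sensor-name=SENSOR_SONY_IMX219_RPI \
      dcc-isp-file=/opt/imaging/imx219/linear/dcc_viss.bin \
      sink_0::dcc-2a-file=/opt/imaging/imx219/linear/dcc_2a.bin \
      sink_0::device=/dev/v4l-subdev2 format-msb=7 \
  ! video/x-raw,format=NV12 \
  ! v4l2h264enc ! h264parse ! mp4mux ! filesink location=cam0.mp4 \
  v4l2src device=/dev/video4 io-mode=dmabuf-import \
  ! video/x-bayer,width=1920,height=1080,format=rggb \
  ! tiovxisp sensor-name=SENSOR_SONY_IMX219_RPI \
      dcc-isp-file=/opt/imaging/imx219/linear/dcc_viss.bin \
      sink_0::dcc-2a-file=/opt/imaging/imx219/linear/dcc_2a.bin \
      sink_0::device=/dev/v4l-subdev3 format-msb=7 \
  ! video/x-raw,format=NV12 \
  ! v4l2h264enc ! h264parse ! mp4mux ! filesink location=cam1.mp4
```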
Multicamera Deep Learning Inference
AM62A has a deep learning accelerator (C7x-MMA) with up to two TOPS, which can run various types of deep learning models for image classification, object detection, semantic segmentation, and more. This section shows how AM62A can run deep learning inference on four different camera streams simultaneously.
Model Selection
TI's EdgeAI-ModelZoo provides hundreds of state-of-the-art models, which are converted/exported from their original training frameworks to an embedded-friendly format so that they can be offloaded to the C7x-MMA deep learning accelerator. The cloud-based Edge AI Studio Model Analyzer provides an easy-to-use "Model Selection" tool. It is dynamically updated to include all models supported in the TI EdgeAI-ModelZoo. The tool requires no previous experience and provides an easy-to-use interface to enter the features required in the desired model.
The TFL-OD-2000-ssd-mobV1-coco-mlperf model was selected for this multi-camera deep learning experiment. This multi-object detection model is developed in the TensorFlow framework with a 300×300 input resolution. Table 4-1 shows the important features of this model when trained on the COCO dataset with about 80 different classes.
Table 4-1. Main Features of the TFL-OD-2000-ssd-mobV1-coco-mlperf Model
Model | Task | Resolution | FPS | mAP 50% Accuracy on COCO | Latency/Frame (ms) | DDR BW Utilization (MB/Frame) |
TFL-OD-2000-ssd-mobV1-coco-mlperf | Multi-object detection | 300×300 | ~152 | 15.9 | 6.5 | 18.839 |
Pipeline Setup
Figure 4-3 shows the 4-camera deep learning GStreamer pipeline. TI provides a suite of GStreamer plugins that allow offloading parts of the media processing and the deep learning inference to the hardware accelerators. Some examples of these plugins include tiovxisp, tiovxmultiscaler, tiovxmosaic, and tidlinferer. The pipeline in Figure 4-3 includes all the plugins required for a multipath GStreamer pipeline with 4 camera inputs, each with media preprocessing, deep learning inference, and postprocessing. The plugins duplicated for each of the camera paths are grouped in the graph for ease of illustration.
The available hardware resources are evenly distributed among the four camera paths. For instance, AM62A contains two image multiscalers: MSC0 and MSC1. The pipeline explicitly dedicates MSC0 to process camera 1 and camera 2 paths, while MSC1 is dedicated to camera 3 and camera 4.
The outputs of the four camera pipelines are scaled down and concatenated using the tiovxmosaic plugin, and the result is displayed on a single screen. Figure 4-4 shows the output of the four cameras with a deep learning model running object detection. Each pipeline (camera) runs at 30 FPS, for a total of 120 FPS.
The complete GStreamer pipeline for the multi-camera deep learning use case follows the structure shown in Figure 4-3.
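As a rough illustration, the sketch below shows one of the four camera paths, patterned after the pipelines generated by the SDK's edgeai-gst-apps; the model directory, device nodes, DCC paths, resolutions, and several plugin properties are placeholders to verify with gst-inspect-1.0 on your SDK version. The full pipeline replicates this branch four times into tiovxmosaic, with the multiscalers of cameras 1-2 pinned to MSC0 and those of cameras 3-4 pinned to MSC1 through the multiscaler target property.

```
# Sketch of one camera path of the 4-camera object detection pipeline.
# MODEL_DIR, /dev/video*, /dev/v4l-subdev*, DCC paths, and property
# names/values are placeholders; check them with gst-inspect-1.0.
MODEL_DIR=/opt/model_zoo/TFL-OD-2000-ssd-mobV1-coco-mlperf
gst-launch-1.0 \
  v4l2src device=/dev/video3 io-mode=dmabuf-import \
  ! video/x-bayer,width=1920,height=1080,format=rggb \
  ! tiovxisp sensor-name=SENSOR_SONY_IMX219_RPI \
      dcc-isp-file=/opt/imaging/imx219/linear/dcc_viss.bin \
      sink_0::dcc-2a-file=/opt/imaging/imx219/linear/dcc_2a.bin \
      sink_0::device=/dev/v4l-subdev2 format-msb=7 \
  ! video/x-raw,format=NV12 \
  ! tiovxmultiscaler name=split0 target=0 \
  split0. ! queue ! video/x-raw,width=300,height=300 \
    ! tiovxdlpreproc model=$MODEL_DIR ! application/x-tensor-tiovx \
    ! tidlinferer model=$MODEL_DIR ! post0.tensor \
  split0. ! queue ! video/x-raw,width=960,height=540 ! post0.sink \
  tidlpostproc name=post0 model=$MODEL_DIR ! queue ! mosaic.sink_0 \
  tiovxmosaic name=mosaic ! video/x-raw,width=1920,height=1080 \
  ! kmssink driver-name=tidss sync=false
```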
Performance Analysis
The setup with four cameras using the V3Link board and the AM62A SK was tested in various application scenarios, including directly displaying on a screen, streaming over Ethernet (four UDP channels), recording to 4 separate files, and with deep learning inference. In each experiment, we monitored the frame rate and the utilization of CPU cores to explore the whole system’s capabilities.
As shown earlier in Figure 4-4, the deep learning pipeline uses the tiperfoverlay GStreamer plugin to show the CPU load as bar graphs at the bottom of the screen. By default, the graphs are updated every two seconds to show the load as a utilization percentage. In addition to the tiperfoverlay GStreamer plugin, the perf_stats tool is a second option that shows core performance directly on the terminal, with an option for saving to a file. This tool is more accurate than tiperfoverlay, as the latter adds extra load on the Arm cores and the DDR to draw the graphs and overlay them on the screen. The perf_stats tool is mainly used to collect the hardware utilization results in all of the test cases shown in this document. Some of the important processing cores and accelerators studied in these tests include the main processors (four A53 Arm cores @ 1.25 GHz), the deep learning accelerator (C7x-MMA @ 850 MHz), the VPAC (ISP) with VISS and multiscalers (MSC0 and MSC1), and DDR operations.
Table 5-1 shows the performance and resource utilization when using AM62A with four cameras for three use cases: streaming four cameras to a display, streaming over Ethernet, and recording to four separate files. Two tests are implemented in each use case: with the cameras only and with deep learning inference. In addition, the first row in Table 5-1 shows the hardware utilization when only the operating system was running on AM62A without any user applications; this is used as a baseline when evaluating the hardware utilization of the other test cases. As shown in the table, the four cameras with deep learning and screen display operated at 30 FPS each, for a total of 120 FPS across the four cameras. This high frame rate is achieved with only 86% of the deep learning accelerator (C7x-MMA) capacity. In addition, it is important to note that the deep learning accelerator was clocked at 850 MHz instead of 1000 MHz in these experiments, which is only about 85% of its maximum performance.
Table 5-1. Performance (FPS) and Resource Utilization of AM62A When Used with 4 IMX219 Cameras for Screen Display, Ethernet Streaming, Recording to Files, and Deep Learning Inference
Application | Pipeline (operation) | Output | FPS avg per pipeline | FPS total | MPU (4x A53 @ 1.25 GHz) [%] | MCU R5 [%] | DLA (C7x-MMA) @ 850 MHz [%] | VISS [%] | MSC0 [%] | MSC1 [%] | DDR Rd [MB/s] | DDR Wr [MB/s] | DDR Total [MB/s] |
No application | Baseline, no operation | NA | NA | NA | 1.87 | 1 | 0 | 0 | 0 | 0 | 560 | 19 | 579 |
Camera only | Stream to screen | Screen | 30 | 120 | 12 | 12 | 0 | 70 | 61 | 60 | 1015 | 757 | 1782 |
Camera only | Stream over Ethernet | UDP: 4 ports, 1920×1080 | 30 | 120 | 23 | 6 | 0 | 70 | 0 | 0 | 2071 | 1390 | 3461 |
Camera only | Record to files | 4 files, 1920×1080 | 30 | 120 | 25 | 3 | 0 | 70 | 0 | 0 | 2100 | 1403 | 3503 |
Camera with deep learning | Object detection (MobV1-coco) and display | Screen | 30 | 120 | 38 | 25 | 86 | 71 | 85 | 82 | 2926 | 1676 | 4602 |
Camera with deep learning | Object detection (MobV1-coco) and stream over Ethernet | UDP: 4 ports, 1920×1080 | 28 | 112 | 84 | 20 | 99 | 66 | 65 | 72 | 4157 | 2563 | 6720 |
Camera with deep learning | Object detection (MobV1-coco) and record to files | 4 files, 1920×1080 | 28 | 112 | 87 | 22 | 98 | 75 | 82 | 61 | 2024 | 2458 | 6482 |
Summary
This application report describes how to implement multi-camera applications on the AM6x family of devices. A reference design based on Arducam's V3Link Camera Solution Kit and the AM62A SK EVM is provided in the report, with several camera applications using four IMX219 cameras, such as streaming and object detection. Users are encouraged to acquire the V3Link Camera Solution Kit from Arducam and replicate these examples. The report also provides a detailed analysis of the performance of AM62A while using four cameras under various configurations, including displaying to a screen, streaming over Ethernet, and recording to files. It also shows AM62A's capability of performing deep learning inference on four separate camera streams in parallel. If there are any questions about running these examples, post them on the TI E2E forum.
References
1. AM62A Starter Kit EVM Quick Start Guide
2. ArduCam V3Link Camera Solution Quick Start Guide
3. Edge AI SDK documentation for AM62A
4. Edge AI Smart Cameras Using Energy-Efficient AM62A Processor
5. Camera Mirror Systems on AM62A
6. Driver and Occupancy Monitoring Systems on AM62A
7. Quad Channel Camera Application for Surround View and CMS Camera Systems
8. AM62Ax Linux Academy on Enabling a CSI-2 Sensor
9. Edge AI ModelZoo
10. Edge AI Studio
11. Perf_stats tool
TI Parts Referenced in This Application Note:
- https://www.ti.com/product/AM62A7
- https://www.ti.com/product/AM62A7-Q1
- https://www.ti.com/product/AM62A3
- https://www.ti.com/product/AM62A3-Q1
- https://www.ti.com/product/AM62P
- https://www.ti.com/product/AM62P-Q1
- https://www.ti.com/product/DS90UB960-Q1
- https://www.ti.com/product/DS90UB953-Q1
- https://www.ti.com/product/TDES960
- https://www.ti.com/product/TSER953
Important Notice and Disclaimer
TI provides technical and reliability data (including data sheets), design resources (including reference designs), application or other design advice, web tools, safety information, and other resources "as is" and with all faults, and disclaims all warranties, express and implied, including without limitation any implied warranties of merchantability, fitness for a particular purpose, or non-infringement of third-party intellectual property rights.
These resources are intended for skilled developers designing with TI products. You are solely responsible for
- selecting the appropriate TI products for your application,
- designing, validating, and testing your application, and
- ensuring your application meets applicable standards, and any other safety, security, regulatory, or other requirements.
These resources are subject to change without notice. TI permits you to use these resources only for the development of an application that uses the TI products described in the resource. Other reproduction and display of these resources is prohibited. No license is granted to any other TI intellectual property right or to any third party intellectual property right. TI disclaims responsibility for, and you will fully indemnify TI and its representatives against, any claims, damages, costs, losses, and liabilities arising out of your use of these resources.
TI's products are provided subject to TI's Terms of Sale or other applicable terms available either on ti.com or provided in conjunction with such TI products. TI's provision of these resources does not expand or otherwise alter TI's applicable warranties or warranty disclaimers for TI products.
TI objects to and rejects any additional or different terms you may have proposed.
Important Notice
- Mailing address: Texas Instruments, Post Office Box 655303, Dallas, Texas 75265
- Copyright © 2024, Texas Instruments Incorporated