The Embedded Vision Systems (EVS) group works within the Computer Vision Laboratory, Department of Automatic Control and Robotics, Faculty of Electrical Engineering, Automatics, Computer Science and Biomedical Engineering (EAIiIB), AGH University of Science and Technology in Krakow, Poland.
We conduct research on the implementation of various types of vision systems that can operate in real time even for high-resolution, high-frame-rate video streams – up to UHD (3840 × 2160 @ 60 fps) – while maintaining low power consumption. As hardware platforms, we use reprogrammable FPGAs (Field Programmable Gate Arrays), reprogrammable SoCs (Systems on Chip) such as the Zynq SoC and Zynq UltraScale+ MPSoC (Multi-Processor System on Chip), and embedded GPU (Graphics Processing Unit) solutions such as NVIDIA's Jetson series. We are also interested in neuromorphic platforms.
Such systems are used in applications that require very fast and efficient processing of video data, so that the extracted information can be used almost immediately to support decision-making or control. Our current research focuses on the following areas:
– control of autonomous, unmanned flying vehicles (drones, UAVs, UAS),
– vision algorithms for autonomous vehicles (self-driving cars) and Advanced Driver Assistance Systems (ADAS),
– object tracking and trajectory determination,
– use of LiDAR data for object detection and environment mapping,
– segmentation of foreground and moving objects, as well as components of advanced video surveillance systems (AVSS),
– implementation of embedded AI methods – deep convolutional neural networks and spiking neural networks.
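The real-time constraints above can be made concrete with a quick calculation: a UHD stream delivers roughly 500 Mpixels per second, i.e. more than one pixel per clock cycle at typical FPGA frequencies, which is why multi-pixel-per-clock (e.g. 4PPC) architectures appear in our publications. A minimal sketch of the arithmetic (the 150 MHz pipeline clock is an assumed, illustrative value, not a group specification):

```python
import math

# UHD stream parameters stated above
WIDTH, HEIGHT, FPS = 3840, 2160, 60
# Assumed, typical FPGA pixel-pipeline clock (illustrative only)
FPGA_CLOCK_HZ = 150_000_000

# Raw pixel throughput the system must sustain
pixels_per_second = WIDTH * HEIGHT * FPS  # 497,664,000 px/s

# How many pixels must be processed in each clock cycle
pixels_per_clock = math.ceil(pixels_per_second / FPGA_CLOCK_HZ)

print(f"Pixel rate: {pixels_per_second / 1e6:.1f} Mpx/s")
print(f"Pixels per clock at 150 MHz: {pixels_per_clock}")  # 4 -> 4PPC
```

At the assumed clock, four pixels must be handled per cycle, which matches the 4PPC formats referenced in the publication list below.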
Since 2011, members of the group have co-organized the international conference DASIP (Conference on Design and Architectures for Signal and Image Processing), cooperating with scientists from Germany, France, Italy, Spain, Portugal and Canada. We also present our research at conferences such as ARC (International Symposium on Applied Reconfigurable Computing), ICCVG (International Conference on Computer Vision and Graphics), SPA (Signal Processing: Algorithms, Architectures, Arrangements and Applications), KKA (National Conference on Automation) and FedCSIS (Federated Conference on Computer Science and Information Systems), and in journals such as Journal of Real-Time Image Processing, Journal of Signal Processing Systems and Sensors.
The team members teach the following courses: Digital Image Processing and Vision Systems (in Polish and English), Reconfigurable Systems, Structured and Object-Oriented Programming, Advanced Object-Oriented Programming, Algorithms and Data Structures, Architectures of Vision Systems, Image Compression and Coding Methods, Software-Hardware Implementation of Algorithms (Hardware-Software Vision Systems), Perception Systems and Algorithms in Autonomous Vehicles, Parallel Computing in CUDA, Biometrics, and HLS Tools. Every year, several bachelor's and master's theses are defended as a result of cooperation between staff and students. In addition, our team closely cooperates with the AVADER Student Research Group, which brings together students interested in embedded vision systems, drones and autonomous vehicles.
TEAM MEMBERS
I specialise in computer vision architectures, with particular emphasis on real-time systems, reconfigurable FPGA devices, and embedded and intelligent systems. More broadly, my interests span many topics in automation and robotics, electronics and applied computer science. I am the author of 120 scientific publications.
Tomasz Kryjak
I am interested in embedded vision systems for autonomous vehicles, drones and advanced automated video surveillance. I implement these in FPGAs and reprogrammable heterogeneous devices. I also work with event cameras, neuromorphic computing and embedded AI systems.
Zbigniew Bubliński
I am interested in optimization of algorithms for digital image processing and analysis and embedded vision systems. I teach courses on vision systems and embedded vision systems.
Piotr Pawlik
I am interested in image analysis and processing with a particular focus on the topic of feature points. I teach courses in computer science and vision systems.
Marcin Kowalczyk
My interests are focused on real-time vision systems for autonomous robots. In my research, I use heterogeneous computing platforms and high-level environments enabling the modelling of algorithms and designed systems. My research also concerns the use of neuromorphic event sensors in modern vision data processing systems that require the development of new data processing methods.
Hubert Szolc
I am interested in the control of autonomous vehicles based primarily on visual information. I use FPGAs and heterogeneous computing platforms for hardware implementation of the algorithms.
Mateusz Wąsala
I am interested in embedded vision systems, in particular SLAM algorithms for environment map generation and positioning of unmanned aerial vehicles. I also design and build equipment for these vehicles.
Dominika Przewłocka-Rus
I am interested in artificial intelligence solutions, in particular the use of neural networks in embedded vision systems. I deal with implementation of such algorithms in FPGAs and reprogrammable heterogeneous computing devices.
Kamil Jeziorek
I am interested in vision systems that use event-driven cameras and methods based on deep neural networks. I am working to explore the potential of implementing techniques such as Graph Convolutional Networks and Vision Transformers on various computing platforms, with a particular focus on SoC FPGAs.
- Blachut, K., Danilowicz, M., Szolc, H., Wasala, M., Kryjak, T., Komorkiewicz, M. (2022): Automotive perception system evaluation with reference data from a UAV’s camera using ArUco markers and DCNN, Journal of Signal Processing Systems.
- Stanisz, J., Lis, K., Gorgon, M. (2021): Implementation of the PointPillars network for 3D object detection in reprogrammable heterogeneous devices using FINN, Journal of Signal Processing Systems.
- Kowalczyk, M., Kryjak, T. (2021): A comparison of real-time 4K/UltraHD connected component labelling architectures, 31st International Conference on Field-Programmable Logic and Applications (FPL), Demo night.
- Przewłocka-Rus, D., Kryjak, T. (2021): Quantised Siamese Tracker for 4K/UltraHD Video Stream – a demo, 31st International Conference on Field-Programmable Logic and Applications (FPL), Demo night.
- Stanisz, J., Lis, K., Kryjak, T., Gorgon, M. (2021): Hardware-software implementation of a DNN for 3D object detection using FINN – a demo, 31st International Conference on Field-Programmable Logic and Applications (FPL), Demo night.
- Wzorek, P., Kryjak, T. (2021): Training dataset generation for automatic registration of a duplicate bridge game, Zeszyty Studenckiego Towarzystwa Naukowego, ISSN 1732-0925.
- Kowalczyk, M., Kryjak, T. (2021): A Connected Component Labelling algorithm for multi-pixel per clock cycle video stream, 24th Euromicro Conference on Digital System Design (DSD).
- Cyba, A., Szolc, H., Kryjak, T. (2021): A simple vision-based navigation and control strategy for autonomous drone racing, 25th International Conference on Methods and Models in Automation and Robotics (MMAR).
- Przewłocka-Rus, D., Kryjak, T. (2021): The bioinspired traffic sign classifier, International Conference on Cybernetic Modelling of Biological Systems (MCSB 2021), Kraków, Poland.
- Przewłocka-Rus, D., Kowalczyk, M., Kryjak, T. (2021): Exploration of Hardware Acceleration Methods for an XNOR Traffic Signs Classifier, In: Choraś, M., Choraś, R.S., Kurzyński, M., Trajdos, P., Pejaś, J., Hyla, T. (eds) Progress in Image Processing, Pattern Recognition and Communication Systems. CORES 2021, IP&C 2021, ACS 2021. Lecture Notes in Networks and Systems, vol 255. Springer, Cham.
- Kowalczyk, M., Ciarach, P., Przewlocka-Rus, D., Szolc, H., Kryjak, T. (2021): Real-Time FPGA Implementation of Parallel Connected Component Labelling for a 4K Video Stream, Journal of Signal Processing Systems, Springer.
- Blachut, K., Danilowicz, M., Szolc, H., Wasala, M., Kryjak, T., Pankiewicz, N., Komorkiewicz, M. (2021): Automotive perception system evaluation with reference data obtained by a UAV, DASIP ’21: Workshop on Design and Architectures for Signal and Image Processing (14th edition), 20 January 2021, Budapest, Hungary.
- Stanisz, J., Lis, K., Kryjak, T., Gorgon, M. (2021): Hardware-software implementation of the PointPillars network for 3D object detection in point clouds, DASIP ’21: Workshop on Design and Architectures for Signal and Image Processing (14th edition), 20 January 2021, Budapest, Hungary.
- Janus, P., Kryjak, T., Gorgon, M. (2020): Foreground Object Segmentation in RGB–D Data Implemented on GPU, Advanced, Contemporary Control: Proceedings of KKA 2020 – the 20th Polish Control Conference, 14–16 October 2020, Łódź, Poland.
- Kucharski, D., Kleczek, P., Jaworek-Korjakowska, J., Dyduch, G., & Gorgon, M. (2020): Semi-supervised nests of melanocytes segmentation method using convolutional autoencoders. Sensors (Switzerland), 20(6).
- Kleczek, P., Jaworek-Korjakowska, J., & Gorgon, M. (2020): A novel method for tissue segmentation in high-resolution H&E-stained histopathological whole-slide images. Computerized Medical Imaging and Graphics, 79.
- Ciarach, P., Kowalczyk, M., Przewlocka, D., & Kryjak, T. (2019): Real-Time FPGA Implementation of Connected Component Labelling for a 4K Video Stream. In C. Hochberger, B. Nelson, A. Koch, R. Woods, & P. Diniz (Eds.), Applied Reconfigurable Computing (pp. 165–180). Cham: Springer International Publishing.
- Przewlocka, D., Kowalczyk, M., & Kryjak, T. (2019): XNOR CNNs in FPGA: real-time detection and classification of traffic signs in 4K – a demo. Conference on Design and Architectures for Signal and Image Processing, DASIP.
- Kowalczyk, M., Przewlocka, D., & Kryjak, T. (2019): Real-time implementation of adaptive correlation filter tracking for 4K video stream – a demo. Conference on Design and Architectures for Signal and Image Processing, DASIP.
- Stanisz, J., Lis, K., Kryjak, T., & Gorgon, M. (2019): Hardware-software implementation of car detection system based on LiDAR sensor data – a demo. Conference on Design and Architectures for Signal and Image Processing, DASIP.
- Radwan, K., & Kryjak, T. (2019): Hardware implementation of the SURF feature detector for 4K 4PPC video stream – a demo. Conference on Design and Architectures for Signal and Image Processing, DASIP.
- Jaworek-Korjakowska, J., Kleczek, P., & Gorgon, M. (2019): Melanoma thickness prediction based on convolutional neural network with VGG-19 model transfer learning. IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops, 2019-June, 2748–2756.
- Piszczek, K., Janus, P., & Kryjak, T. (2018): The use of HACP+SBT lossless compression in optimizing memory bandwidth requirement for hardware implementation of background modelling algorithms. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 10824 LNCS, 379–391.
- Kowalczyk, M., Przewlocka, D., & Kryjak, T. (2018): Real-Time Implementation of Contextual Image Processing Operations for 4K Video Stream in Zynq UltraScale+ MPSoC. Conference on Design and Architectures for Signal and Image Processing, DASIP, 2018-October, 37–42.
- Kryjak, T., Komorkiewicz, M., & Gorgon, M. (2018): Real-time hardware–software embedded vision system for ITS smart camera implemented in Zynq SoC. Journal of Real-Time Image Processing, 15(1), 123–159.
- Radwan, K., Kryjak, T., & Gorgon, M. (2018): Hardware-software implementation of an SfM module for navigation of unmanned aerial vehicles – a demo. Conference on Design and Architectures for Signal and Image Processing, DASIP.
- Janus, P., & Kryjak, T. (2018): Hardware implementation of the Gaussian mixture model foreground object segmentation algorithm working with ultra-high resolution video stream in real-time. Signal Processing – Algorithms, Architectures, Arrangements, and Applications Conference Proceedings, SPA, 2018-September, 140–145.
- Blachut, K., Kryjak, T., & Gorgon, M. (2018): Hardware implementation of multi-scale Lucas-Kanade optical flow computation algorithm – a demo. Conference on Design and Architectures for Signal and Image Processing, DASIP.
- Przewlocka, D., & Kryjak, T. (2018): Hardware acceleration of face detection using a deep convolutional neural network – a demo. Conference on Design and Architectures for Signal and Image Processing, DASIP.
- Fraczek, P., Mora, A., & Kryjak, T. (2018): Embedded vision system for automated drone landing site detection – a demo. Conference on Design and Architectures for Signal and Image Processing, DASIP.
- Fraczek, P., Mora, A., & Kryjak, T. (2018): Embedded vision system for automated drone landing site detection. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 11114 LNCS, 397–409.
External projects:
- 2020-2022 – „The development of solutions using modern vision sensors (event cameras) for applications related to autonomous, unmanned aerial vehicles” – an AGH project within the framework of Initiative for Excellence – Research University. Project leader: Marek Gorgoń, Contractors: EVS group
- 2017-2021 – „The development of computing resources organization in latest generation of heterogeneous reconfigurable devices enabling real-time processing of UHD/4K video stream” – grant no. 2016/23/D/ST6/01389, National Science Centre, Poland (Sonata 12). Project leader: Tomasz Kryjak, Contractors: Dominika Przewłocka-Rus, Marcin Kowalczyk, Piotr Janus, Krzysztof Błachut, Hubert Szolc, Mateusz Wąsala, Piotr Ciarach
- 2015-2016 – „A functional model of a machine with a vision system for scarification and evaluation of acorn viability based on automatic recognition of the topography of mummification changes” – National Center for Research and Development project, part of the Applied Research Program. Contractors: Marek Gorgoń, Tomasz Kryjak
- 2011-2013 – „Reconfigurable implementation of hardware modules for processing and analysis of complex video signals”, grant no. 2011/01/N/ST7/06687, National Science Centre Poland (Preludium). Project leader: Tomasz Kryjak
- 2011-2013 – „Intelligent surveillance system for space and objects of particular importance – SIMPOZ” – grant no. 0128/R/t00/2010/12, Ministry of Science and Higher Education of the Republic of Poland. Contractors: Marek Gorgoń, Tomasz Kryjak
Projects for young scientists:
- 2021 – „Traffic sign classification using spiking neural networks for event and classical camera data”, Project leader: Dominika Przewłocka-Rus
- 2021 – „Hardware acceleration of classical UAV control algorithms for autonomous drone racing”, Project leader: Hubert Szolc
- 2021 – „Deep neural network acceleration for car detection in LiDAR point cloud on a heterogeneous platform”, Project leader: Joanna Stanisz
- 2021 – „Detection of vehicles and their relative positions from the perspective of an unmanned aerial vehicle for the evaluation of autonomous vehicle perception systems”, Project leader: Krzysztof Błachut
- 2021 – „Implementation of spiking neural networks for environment perception in LiDAR point clouds”, Project leader: Konrad Lis
- 2021 – „Hardware acceleration of video object tracking algorithms based on correlation filters”, Project leader: Michał Daniłowicz
- 2021 – „Tracking fast moving objects with event and classical camera – a comparison”, Project leader: Marcin Kowalczyk
- 2021 – „Application of deep neural networks for the determination of position and orientation of an unmanned aerial vehicle in space – a comparison with classical methods”, Project leader: Mateusz Wąsala
- 2020 – „Object tracking using algorithms based on the correlation filter”, Project leader: Marcin Kowalczyk
- 2020 – „Object tracking using Siamese neural networks”, Project leader: Dominika Przewłocka-Rus
- 2020 – „Object tracking by detection using deep convolutional neural networks”, Project leader: Tomasz Kryjak
- 2019 – „Development of software and hardware architecture for the task of traffic sign detection and recognition using deep convolutional networks”, Project leader: Dominika Przewłocka
- 2019 – „Analysis and hardware-software implementation of obstacle detection algorithms for an autonomous vehicle”, Project leader: Marcin Kowalczyk
- 2019 – „Development of hardware and software architecture for the task of traffic lights detection and recognition for autonomous vehicles and driver assistance systems”, Project leader: Tomasz Kryjak
- 2018 – „Hardware implementation of algorithms for detection and classification of objects based on the analysis of data from the LIDAR sensor” – Faculty of Electrical Engineering, Automatics, Computer Science and Biomedical Engineering Dean grant. Project number: 15.11.120.712. Project leader: Tomasz Kryjak
- 2017 – „Hardware implementation of algorithms for detection and re-identification of persons as well as detection of abandoned luggage in an advanced, automatic video monitoring system” – Faculty of Electrical Engineering, Automatics, Computer Science and Biomedical Engineering Dean grant. Project number: 15.11.120.623. Project leader: Tomasz Kryjak
- 2016 – „Hardware implementation of selected objects segmentation algorithms in reconfigurable FPGAs, heterogeneous Zynq SoC and programmable GPGPUs” – Faculty of Electrical Engineering, Automatics, Computer Science and Biomedical Engineering Dean grant. Project number: 15.11.120.879. Project leader: Tomasz Kryjak
- 2015 – „The use of heterogeneous computing platforms in object tracking task for video processing systems” – Faculty of Electrical Engineering, Automatics, Computer Science and Biomedical Engineering Dean grant. Project number: 15.11.120.476. Project leader: Tomasz Kryjak
- 2014 – „The use of heterogeneous computing platforms in object classification task for video processing systems” – Faculty of Electrical Engineering, Automatics, Computer Science and Biomedical Engineering Dean grant, 2014. Project number: 15.11.120.406. Project leader: Tomasz Kryjak
- 2013 – „Heterogeneous computing systems evaluation in object detection and recognition” – Faculty of Electrical Engineering, Automatics, Computer Science and Biomedical Engineering Dean grant. Project number: 15.11.120.330. Project leader: Tomasz Kryjak
- 2012 – „The use of 3D information and thermal imaging in advanced video surveillance systems” – Faculty of Electrical Engineering, Automatics, Computer Science and Biomedical Engineering Dean grant. Project number: 15.11.120.231. Project leader: Tomasz Kryjak
Commercial projects:
- 2019.03.31 – 2019.10.31 – Project: Analysis of the possibilities of using SLAM technology based on visual information for positioning of an autonomous vehicle. Company: ABB Ltd, Warszawa/Kraków, Poland. Project leader: Tomasz Kryjak. Contractors: Marcin Kowalczyk, Krzysztof Błachut, Hubert Szolc, Mateusz Wąsala
- 2018.11.15 – 2019.04.30 – Projects: Analysis of the possibilities of automating the process of reading the results of cassette tests and ELISA tests (well-plate tests). Implementation of a C++ application using the OpenCV library (version 4.x) that automates the process of reading the results of cassette tests (part A) and ELISA tests (plate tests) (part B). Company: ABERIT, Rzeszów, Poland. Project leader: Tomasz Kryjak. Contractor: Dominika Przewłocka
- 2017.09.11 – 2018.08.08 – Project: Model of the Witrak videotracker (in FPGA). Company: PCO Ltd., Warszawa, Poland. Project leader: Marek Gorgoń. Contractors: Tomasz Kryjak, Marcin Kowalczyk