19 results for "Andrey V. Kudryavtsev"
Search Results
2. Full 3D rotation estimation in scanning electron microscope.
- Author
- Andrey V. Kudryavtsev, Sounkalo Dembélé, and Nadine Piat
- Published
- 2017
- Full Text
- View/download PDF
3. Eye-in-Hand Visual Servoing of Concentric Tube Robots.
- Author
- Andrey V. Kudryavtsev, Mohamed Taha Chikhaoui, Aleksandr Liadov, Patrick Rougeot, Fabien Spindler, Kanty Rabenorosoa, Jessica Burgner-Kahrs, Brahim Tamadazte, and Nicolas Andreff
- Published
- 2018
- Full Text
- View/download PDF
4. Stereovision-based control for automated MOEMS assembly.
- Author
- Andrey V. Kudryavtsev, Guillaume J. Laurent, Cédric Clévy, Brahim Tamadazte, and Philippe Lutz
- Published
- 2015
- Full Text
- View/download PDF
5. Autocalibration method for scanning electron microscope using affine camera model.
- Author
- Andrey V. Kudryavtsev, Valérian Guelpa, Patrick Rougeot, Olivier Lehmann, Sounkalo Dembélé, Peter Sturm, and Nadine Le Fort-Piat
- Published
- 2020
- Full Text
- View/download PDF
6. Methods of Laboratory Studies of the Star Draining Tillage Working Body
- Author
- Andrey V. Kudryavtsev and Filipp L. Blinov
- Subjects
- Tillage, Hydrology, Environmental science, Star (graph theory)
- Abstract
At the moment, the issue of creating and positioning tools for deep loosening of the subsurface layer and the formation of aeration drainage has been studied in detail. The modern market offers many different tools for deep loosening or moling, but an analysis of existing machines and tools, as well as of promising developments presented in the form of patents, shows that there are no systems capable of performing deep loosening and drainage operations together. (Research purpose) The research purpose is to create a fundamentally new star-shaped design of a draining tillage working body that can form an optimal water-air regime of the soil through maximum aeration drainage per unit area, with simultaneous loosening of the subsurface horizon, excluding over-compaction of the near-drainage zone as part of the soil-forming system. (Materials and methods) To simulate the operating process, predict and identify the optimal parameters and operating modes, and bring the developed working body to field testing, a number of laboratory experiments and studies must be conducted. The article presents the scheme of the laboratory installation and describes the principle of its operation. (Results and discussion) The article presents a full factorial experiment with three factors and three levels of variation in three-fold repetition to assess the influence of the factors on the operation of the star-shaped drainage working body, the quality of the treatment of the subsurface horizon and the formation of drains, and the interaction of the factors with each other, as well as to create a mathematical model of the combined tillage process, including deep loosening and aeration drainage. (Conclusions) The proposed method will allow the optimal parameters and operating modes of the working body to be identified more accurately and systematically, based on changes in the optimization parameters.
- Published
- 2021
- Full Text
- View/download PDF
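The experimental design in the abstract above, a full factorial with three factors at three levels in three-fold repetition, can be enumerated in a few lines. A minimal sketch; the factor names and level values are hypothetical placeholders, not the authors':

```python
from itertools import product

# Hypothetical factors and levels for a 3-factor, 3-level full factorial design
factors = {
    "working_depth_cm": [30, 40, 50],
    "forward_speed_kmh": [4, 6, 8],
    "star_rotation_rpm": [20, 40, 60],
}

# Every combination of levels, each repeated three times (three-fold repetition)
runs = [
    dict(zip(factors, levels))
    for levels in product(*factors.values())
    for _ in range(3)
]

print(len(runs))  # 3^3 level combinations x 3 repetitions = 81 runs
```

Randomizing the order of `runs` before execution is the usual next step, so that drift in soil conditions does not confound the factor effects.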
7. Evaluation of Technical and Economic Efficiency Within the Implementation of Innovative Methods and Means for the Fight Against Sosnovsky
- Author
- Alexey A. Smirnov, Andrey V. Kudryavtsev, and Vycheslav V. Golubev
- Subjects
- Economic efficiency, Business, Environmental economics
- Published
- 2020
- Full Text
- View/download PDF
8. Automatic Tip-Steering of Concentric Tube Robots in the Trachea Based on Visual SLAM
- Author
- Brahim Tamadazte, Pierre Renaud, Kanty Rabenorosoa, Cedric Girerd, Patrick Rougeot, and Andrey V. Kudryavtsev (ISIR, Sorbonne Université/CNRS; UCLA; FEMTO-ST, UMR 6174; ICube, Université de Strasbourg/CNRS/INSA Strasbourg)
- Subjects
- Soft robotics, Computer science, Medical robotics, Concentric, Simultaneous localization and mapping, Visual servoing, Computer vision, Software deployment, Robot, Artificial intelligence, Focus (optics), Communication channel
- Published
- 2020
- Full Text
- View/download PDF
9. Autocalibration method for scanning electron microscope using affine camera model
- Author
- Nadine Le Fort-Piat, Peter Sturm, Valerian Guelpa, Olivier Lehmann, Sounkalo Dembélé, Andrey V. Kudryavtsev, and Patrick Rougeot (FEMTO-ST, UMR 6174; Inria Grenoble Rhône-Alpes, STEEP/LJK; funded under ANR-17-EURE-0002 EIPHI)
- Subjects
- Optimization problem, Computer science, Context (language use), Regularization (mathematics), Convergence (routing), Calibration, Computer vision, Computer Science Applications, Hardware and Architecture, Virtual image, Metric (mathematics), Computer Vision and Pattern Recognition, Affine transformation, Artificial intelligence, Software
- Abstract
This paper deals with the autocalibration of a scanning electron microscope (SEM), a technique that computes camera motion and intrinsic parameters. In contrast to classical calibration, which requires a calibration object and is known to be a tedious and rigid operation, auto- or self-calibration is performed directly on the images acquired for the visual task. As autocalibration is an optimization problem, all the steps contributing to the success of the algorithm are presented: formulation of the cost function incorporating metric constraints, definition of bounds, regularization, and the optimization algorithm. The presented method allows full estimation of the camera matrices for all views in the sequence. It was validated on virtual images as well as on real SEM images (pollen grains, cutting tools, etc.). The results show a good convergence range and low execution time compared to classical methods, which is all the more notable in the context of SEM calibration.
- Published
- 2020
- Full Text
- View/download PDF
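The abstract above frames autocalibration as a bounded, regularized least-squares problem. The following toy stand-in shows that structure on a two-parameter affine camera; the parameters, prior, and cost are illustrative assumptions, not the authors' actual formulation:

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)

# "True" magnification and in-plane rotation of a toy affine (parallel-
# projection) camera, to be recovered from noisy projections.
m_true, theta_true = 1200.0, 0.1
X = rng.uniform(-0.5, 0.5, size=(50, 2))   # normalized planar sample points

def project(params, pts):
    """Affine camera as a scaled in-plane rotation."""
    m, theta = params
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s], [s, c]])
    return m * pts @ R.T

x_obs = project([m_true, theta_true], X) + rng.normal(0.0, 0.5, X.shape)

def residuals(params):
    data = (project(params, X) - x_obs).ravel()
    reg = 1e-3 * (params[0] - 1000.0)      # mild regularization toward a prior
    return np.append(data, reg)

# Bounds keep the search in a physically plausible range -- the same
# ingredients the abstract lists: cost function, bounds, regularization.
fit = least_squares(residuals, x0=[1000.0, 0.0],
                    bounds=([500.0, -np.pi], [5000.0, np.pi]))
print(fit.x)  # close to (1200, 0.1)
```

The real problem optimizes full camera matrices over many views under metric constraints, but the optimizer-facing shape (residual vector, bounds, regularizer) is the same.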
10. Laser Beam Steering Along Three-Dimensional Paths
- Author
- Brahim Tamadazte, Jean-Antoine Seon, Nicolas Andreff, Rupert Renevier, and Andrey V. Kudryavtsev
- Subjects
- Laser surgery, Observational error, Computer science, Inversion (meteorology), Welding, Laser, Computer Science Applications, Surface micromachining, Control and Systems Engineering, Control theory, Robustness (computer science), Robot, Electrical and Electronic Engineering
- Abstract
This paper deals with the development of a vision-based control scheme for three-dimensional (3-D) laser steering. It proposes to incorporate a simplified trifocal constraint inside a path-following scheme in order to ensure intuitive 3-D control of laser spot displacements in an unknown environment (target). The described controller is obtained without complex mathematical formulation or matrix inversion, as it requires only weak camera and hand-eye calibration. The developed control law was validated in both simulation and experimental conditions using various scenarios (e.g., static and deformable 3-D scenes, different control gains, initial velocities, etc.). The obtained results exhibit good accuracy and robustness with respect to calibration and measurement errors and scene variations. In addition, with this kind of laser beam steering controller, it becomes possible to perfectly decouple the laser spot velocity from both the path shape and time. These features fit the requirements of several industrial applications (welding, micromachining, etc.) as well as surgical purposes (e.g., laser surgery).
- Published
- 2018
- Full Text
- View/download PDF
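The decoupling property named in the abstract above is the hallmark of path following: the spot advances along the path tangent at a chosen speed while a proportional term cancels the lateral error. A hedged 2-D sketch of that idea (an illustrative controller, not the paper's trifocal-constraint scheme):

```python
import numpy as np

def path(s):      # reference path: unit circle, parameterized by arc length
    return np.array([np.cos(s), np.sin(s)])

def tangent(s):   # unit tangent of the path
    return np.array([-np.sin(s), np.cos(s)])

spot = np.array([1.3, 0.0])            # laser spot starts off the path
s, speed, k_lat, dt = 0.0, 0.5, 5.0, 0.01
for _ in range(2000):
    err = path(s) - spot               # error to the moving reference point
    v = speed * tangent(s) + k_lat * err   # feedforward tangent + feedback
    spot = spot + v * dt
    s += speed * dt                    # reference advances at constant speed
print(np.linalg.norm(path(s) - spot))  # small residual tracking error
```

Because `speed` appears only in the feedforward term and the advance of `s`, the spot velocity can be retuned without touching the path geometry, which is the decoupling the abstract describes.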
11. Accurate 3D-Positioning in a SEM through Robot Calibration
- Author
- Valerian Guelpa, Sounkalo Dembélé, Nadine Piat, and Andrey V. Kudryavtsev (FEMTO-ST, UMR 6174)
- Subjects
- Autofocus, Image formation, Robot calibration, Computer science, 3D reconstruction, Kinematics, Visual servoing, Robot, Computer vision, Artificial intelligence, Robotic arm
- Abstract
With the growing trend of miniaturization, new challenges have appeared in microrobotics. In particular, the complexity of the microworld comes from the fact that visual sensors such as the scanning electron microscope have very different principles of image formation than classical cameras, and their field of view stays very limited. Moreover, the kinematic models of the robots used are usually not well defined. The consequence of both properties is that even a small movement of the robot arm leads to a huge object displacement compared to the size of the viewed area. This paper develops a procedure for rotating an object while keeping it at the same 3D position in open loop. Such performance is achieved by a method of robot calibration based on visual servoing and autofocus inside the SEM. These properties are required for manipulation and 3D reconstruction inside the SEM.
- Published
- 2018
- Full Text
- View/download PDF
12. Eye-in-Hand Visual Servoing of Concentric Tube Robots
- Author
- Nicolas Andreff, Mohamed Taha Chikhaoui, Andrey V. Kudryavtsev, Jessica Burgner-Kahrs, Brahim Tamadazte, Patrick Rougeot, Kanty Rabenorosoa, Fabien Spindler, and Aleksandr Liadov (FEMTO-ST, UMR 6174; Leibniz Universität Hannover; Inria Rennes - Bretagne Atlantique, RAINBOW/IRISA)
- Subjects
- Eye-in-hand visual servoing, Control and Optimization, Computer science, Model-based control, Biomedical Engineering, Kinematics, Visual servoing, Artificial intelligence, Control theory, CTR kinematics, Computer vision, Mechanical Engineering, Torsion (mechanics), Optimal control, Computer Science Applications, Visualization, Concentric tube robot, Human-Computer Interaction, Nonlinear system, Medical application, Control and Systems Engineering, Control system, Robot, Computer Vision and Pattern Recognition
- Abstract
This paper deals with the development of a vision-based controller for a continuum robot architecture. More precisely, the controlled robotic structure is a three-tube concentric tube robot (CTR), an emerging paradigm for designing accurate, miniaturized, and flexible endoscopic robots. This approach has grown considerably in recent years, finding applications in numerous surgical disciplines. In contrast to conventional robotic structures, CTR kinematics raises many challenges for optimal control, such as friction, torsion, shear, and non-linear constitutive behavior. In fact, to ensure efficient and reliable control, in addition to computing an analytical and complete kinematic model, it is also important to close the control loop. To do this, we developed an eye-in-hand visual servoing scheme using a millimeter-sized camera embedded at the robot's tip. Both the kinematic model and the visual servoing controller were successfully validated in simulation with ViSP (Visual Servoing Platform) and on an experimental setup. The obtained results showed satisfactory performance for 3-degree-of-freedom positioning and path-following tasks with adaptive gain control.
- Published
- 2018
- Full Text
- View/download PDF
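The abstract above closes the loop with visual servoing under adaptive gain. The classic law is v = -λ L⁺ e, and ViSP's adaptive gain increases near convergence to speed up the final approach. A toy sketch with an identity interaction matrix (gains and the 2-D feature model are illustrative, not the paper's CTR model):

```python
import numpy as np

def adaptive_gain(err_norm, g0=4.0, ginf=0.4, slope0=30.0):
    """Gain g0 at zero error, decaying to ginf for large errors
    (the functional form used by ViSP's adaptive gain)."""
    return (g0 - ginf) * np.exp(-slope0 * err_norm / (g0 - ginf)) + ginf

L = np.eye(2)                      # toy interaction matrix (2D point feature)
e = np.array([0.3, -0.2])          # initial feature error
dt = 0.01
for _ in range(2000):
    lam = adaptive_gain(np.linalg.norm(e))
    v = -lam * np.linalg.pinv(L) @ e    # velocity command v = -lambda L^+ e
    e = e + v * dt                      # first-order closed-loop error dynamics
print(np.linalg.norm(e))  # error driven essentially to zero
```

With a fixed gain, the error decays slowly near zero; the adaptive gain keeps the exponential decay fast at the end without causing large commands at the start.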
13. Full 3D Rotation Estimation in Scanning Electron Microscope
- Author
- Nadine Piat, Andrey V. Kudryavtsev, and Sounkalo Dembélé (FEMTO-ST, UMR 6174)
- Subjects
- Scanning electron microscope, Computer science, Parallel projection, 3D reconstruction, Object (computer science), Optical axis, Orientation (geometry), Computer vision, Artificial intelligence, Affine transformation, Pose, Rotation (mathematics)
- Abstract
Estimation of 3D object position is a crucial step for a variety of robotics and computer vision applications, including 3D reconstruction and object manipulation. When working at microscale, new types of visual sensors are used, such as the Scanning Electron Microscope (SEM). Nowadays, micro- and nanomanipulation tasks, namely components assembly, are performed in teleoperated mode in most cases. Measuring object position and orientation is a crucial step towards automatic object handling. Current methods of pose estimation in SEM allow recovering full object movement using its computer-aided design (CAD) model. If the model is not known, most methods allow estimating only in-plane translations and the rotation around the camera optical axis. In the literature, the SEM is considered as a camera with parallel projection, or an affine camera, which implies image invariance to z-movement and the bas-relief ambiguity. In this paper, the authors address the problem of measuring the full 3D rotation of an unknown scene for an uncalibrated SEM without additional sensors. Rotations are estimated from image triplets by solving a spherical triangle from fundamental matrices only, without the need for intrinsic calibration, which avoids parallel-projection ambiguities. The presented results, obtained in simulation and on real data, validate the proposed scheme.
- Published
- 2017
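The geometric step named in the abstract above, solving a spherical triangle, ultimately rests on the spherical law of cosines: cos a = cos b cos c + sin b sin c cos A, where a, b, c are the arc sides and A is the angle opposite a. A self-contained illustration of that identity only (not the paper's full rotation-estimation pipeline):

```python
import numpy as np

def spherical_angle(a, b, c):
    """Angle A opposite side a in a spherical triangle with sides a, b, c
    (radians), via the spherical law of cosines."""
    cosA = (np.cos(a) - np.cos(b) * np.cos(c)) / (np.sin(b) * np.sin(c))
    return np.arccos(np.clip(cosA, -1.0, 1.0))  # clip guards rounding error

# Example: the equilateral spherical triangle whose sides are all quarter
# great circles (90 degrees) also has three 90-degree angles.
A = spherical_angle(np.pi / 2, np.pi / 2, np.pi / 2)
print(np.degrees(A))  # ~90 degrees
```

In the paper's setting, the sides come from the epipolar geometry of the three view pairs, and the recovered angles constrain the inter-view rotations.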
14. Stereo-image rectification for dense 3D reconstruction in scanning electron microscope
- Author
- Andrey V. Kudryavtsev, Sounkalo Dembélé, and Nadine Piat (FEMTO-ST, UMR 6174, AS2M department)
- Subjects
- Image formation, Parallel projection, 3D reconstruction, Horizontal line test, Optical axis, Rectification, Computer vision, Image rectification, Artificial intelligence, Fundamental matrix (computer vision), Mathematics
- Abstract
Stereo rectification is a crucial step for a number of computer vision problems, and in particular for dense 3D reconstruction, which is a very powerful characterization tool for microscopic objects. Rectification simplifies and speeds up the correspondence search in a pair of images: the search space is reduced to a horizontal line. It is mainly developed for the perspective camera model: a projective transformation is found and applied to both images. This paper addresses the rectification problem for an image pair obtained with a Scanning Electron Microscope (SEM). In this case, image formation is described by a parallel projection; indeed, perspective effects can be neglected because of the low value of the sample-size-to-working-distance ratio. Based on these hypotheses, a robust estimation of the fundamental matrix, which describes the geometry of the image pair, is proposed. It filters out up to 50% of outliers in a feature correspondence set. The matrix is then used to develop a rectification solution: the problem is reduced to two rotations about the optical axis of the camera. The solution is accurate and does not require any calibration of the SEM. It is validated with two image pairs from two different field-emission SEMs.
- Published
- 2017
- Full Text
- View/download PDF
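For a parallel-projection pair, the fundamental matrix is affine (only five nonzero entries), so all epipolar lines in each image share one direction and rectification indeed reduces to one rotation per image. A small sketch of recovering the two angles from such a matrix; the numbers are arbitrary and the formulas are a standard derivation, not necessarily the paper's exact estimator:

```python
import numpy as np

# Affine fundamental matrix: [[0, 0, a], [0, 0, b], [c, d, e]].
F = np.array([[0.0, 0.0, 1.0],
              [0.0, 0.0, 2.0],
              [-0.5, 1.5, 3.0]])
a, b = F[0, 2], F[1, 2]
c, d = F[2, 0], F[2, 1]

# Epipolar lines in image 2: a*x' + b*y' + const = 0, direction (b, -a).
# Epipolar lines in image 1: c*x  + d*y  + const = 0, direction (d, -c).
theta2 = np.arctan2(-a, b)   # rotating image 2 coords by -theta2 levels its lines
theta1 = np.arctan2(-c, d)   # likewise for image 1

def rot(t):
    return np.array([[np.cos(t), -np.sin(t)], [np.sin(t), np.cos(t)]])

# After rotation the x-coefficient of each line vanishes: lines are horizontal.
print((np.array([a, b]) @ rot(theta2))[0], (np.array([c, d]) @ rot(theta1))[0])
```

The printed coefficients are zero up to floating-point rounding, confirming that a single rotation per image aligns all epipolar lines with the horizontal scan direction.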
15. Autofocus on moving object in scanning electron microscope
- Author
- Sounkalo Dembélé, Andrey V. Kudryavtsev, and Nadine Piat
- Subjects
- Autofocus, Computer science, Object (computer science), Frame rate, Atomic and Molecular Physics, and Optics, Electronic, Optical and Magnetic Materials, Visualization, Optics, Robustness (computer science), Focal length, Computer vision, Artificial intelligence, Noise (video), Focus (optics), Instrumentation
- Abstract
The sharpness of images coming from a Scanning Electron Microscope (SEM) is a very important property for many computer vision applications at micro- and nanoscale. It represents how distinct object details are in the images: the object may be perceived as sharp or blurred. Image sharpness highly depends on the focal distance, or working distance in the case of the SEM. Autofocus is the technique that automatically adjusts the working distance to maximize sharpness. Most existing algorithms work only with a static object, which is enough for visualization, manual microanalysis, or microcharacterization. These applications work at a low frame rate, less than 1 Hz, which guarantees a low level of noise. However, static autofocus cannot be used for samples performing continuous 3D motion, which is the case in robotic applications requiring continuous 3D position measurement, e.g., nano-assembly or nanomanipulation. Moreover, in addition to constantly keeping the object in focus while it is moving, the operation must be performed at a high frame rate. The approach offering both possibilities is presented in this paper and is referred to as dynamic autofocus. The presented solution is based on stochastic optimization techniques. It tracks the maximum of image sharpness without a sweep and without training, and it works under uncertainty: noise in the images, unknown maximal sharpness, and unknown 3D motion of the specimen. The experiments, performed with noisy images at a high frame rate (5 Hz), were conducted on a Carl Zeiss Auriga 60 FE-SEM. They prove the robustness of the algorithm with respect to variation of the optimization parameters, object speed, and magnification. Moreover, it is invariant to the object structure and its variation in time.
- Published
- 2017
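The dynamic-autofocus idea above, tracking a drifting sharpness maximum online with no sweep and no training, can be caricatured as stochastic hill climbing on the working distance. Everything here is a synthetic assumption (the sharpness curve, the drift, the step sizes); a real SEM would score sharpness from the acquired image instead:

```python
import numpy as np

rng = np.random.default_rng(1)

def sharpness(wd, wd_peak):
    """Synthetic sharpness: peaked at wd_peak, with measurement noise."""
    return np.exp(-((wd - wd_peak) ** 2) / 8.0) + rng.normal(0.0, 0.01)

wd, wd_peak = 6.5, 8.0                    # start slightly out of focus
for _ in range(600):
    wd_peak += 0.002                      # specimen drifts along z (3D motion)
    cand = wd + rng.normal(0.0, 0.3)      # stochastic probe of working distance
    if sharpness(cand, wd_peak) > sharpness(wd, wd_peak):
        wd = cand                         # keep moves that improve sharpness
print(abs(wd - wd_peak))  # stays near the moving sharpness peak
```

Because each iteration needs only two sharpness evaluations, the update can keep pace with a high frame rate, unlike a full focus sweep.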
16. Automated robotic microassembly of flexible optical components
- Author
- Brahim Tamadazte, Philippe Lutz, Bilal Komati, Andrey V. Kudryavtsev, Joël Agnus, Guillaume J. Laurent, and Cédric Clévy
- Subjects
- Engineering, Control engineering, Inertia, Fully automated, Grippers, Eye tracking, Computer vision, Artificial intelligence, Position control, Microscale chemistry
- Abstract
This paper studies the fabrication of hybrid microcomponents through automated robotic microassembly. The robotic station used for the microassembly is presented, and its use for the assembly of flexible optical microcomponents serves as a case study. Fully automated microassembly is pursued for better repeatability and accuracy of the tasks and to reduce the cycle time. For this reason, two complementary techniques are proposed and presented in this paper. The first consists of automated manipulation and insertion tasks using stereovision CAD-model-based visual tracking. The second uses hybrid force/position control and enables grasping, guiding, and releasing tasks to be performed in less than 1 s despite microscale specificities. These specificities are mainly the predominance of surface forces, the difficulty of integrating sensors at this scale, the very small inertia of microcomponents together with their high dynamics, and the lack of precise models.
- Published
- 2016
- Full Text
- View/download PDF
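Hybrid force/position control, the second technique named above, splits the task axes with a selection matrix: some axes track a position reference while the rest regulate a contact force. A minimal sketch of that mixing; the axes, gains, and setpoints are illustrative, not the authors' values:

```python
import numpy as np

# Selection matrix: x and y are position-controlled, z is force-controlled
# (e.g., the grasp axis of a gripper).
S = np.diag([1.0, 1.0, 0.0])
Kp, Kf = 50.0, 0.02    # position gain (1/s) and force-to-velocity gain (m/(N*s))

def control(x, xd, f, fd):
    """Velocity command mixing position error and force error per axis."""
    e_pos = xd - x                         # position error (m)
    e_f = fd - f                           # force error (N)
    I = np.eye(3)
    return S @ (Kp * e_pos) + (I - S) @ (Kf * e_f)

v = control(x=np.array([0.0, 0.0, 0.0]),
            xd=np.array([1e-3, 0.0, 0.0]),   # move 1 mm along x
            f=np.array([0.0, 0.0, 0.1]),
            fd=np.array([0.0, 0.0, 0.5]))    # tighten grasp to 0.5 N
print(v)  # moves along x while pushing along z until the grasp force is reached
```

The complementary structure of `S` and `I - S` guarantees that no axis receives conflicting position and force commands at the same time.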
17. Stereovision-based control for automated MOEMS assembly
- Author
- Brahim Tamadazte, Philippe Lutz, Andrey V. Kudryavtsev, Cédric Clévy, and Guillaume J. Laurent
- Subjects
- Microelectromechanical systems, Engineering, Robot kinematics, Position (vector), Teleoperation, Eye tracking, Computer vision, Artificial intelligence, Visual servoing, Automation, Microscale chemistry
- Abstract
Microassembly represents a very promising solution for the fabrication of microproducts and complex Micro-Electro-Mechanical Systems (MEMS). Since, in the case of teleoperated assembly, the operator is the main source of errors, there is great interest in microassembly automation. Its main issue is the precise estimation of object position. Previous studies demonstrate the applicability of model-based visual tracking algorithms from the ViSP (Visual Servoing Platform) library. However, the methods of macroassembly cannot be directly applied when working with microobjects. The characterization of single-view visual tracking notably revealed the complexity of depth estimation at microscale, which is due to the small depth variation in the observed images compared with the distance from the camera. Therefore, an algorithm for Z-coordinate reconstruction using a second camera was developed and analyzed for the visual servoing task. It was then used to automate microassembly. Experiments demonstrate the possibility of automatic microassembly of complex microcomponents with a precision better than 10 micrometers.
- Published
- 2015
- Full Text
- View/download PDF
18. Analysis of CAD Model-based Visual Tracking for Microassembly using a New Block Set for MATLAB/Simulink
- Author
-
Brahim Tamadazte, Guillaume J. Laurent, Cédric Clévy, Andrey V. Kudryavtsev, Philippe Lutz, Franche-Comté Électronique Mécanique, Thermique et Optique - Sciences et Technologies (UMR 6174) (FEMTO-ST), Université de Technologie de Belfort-Montbeliard (UTBM)-Ecole Nationale Supérieure de Mécanique et des Microtechniques (ENSMM)-Université de Franche-Comté (UFC), and Université Bourgogne Franche-Comté [COMUE] (UBFC)-Université Bourgogne Franche-Comté [COMUE] (UBFC)-Centre National de la Recherche Scientifique (CNRS)
- Subjects
0209 industrial biotechnology ,Computer science ,business.industry ,Mechanical Engineering ,Process (computing) ,CAD ,02 engineering and technology ,021001 nanoscience & nanotechnology ,Object (computer science) ,Automation ,[SPI.AUTO]Engineering Sciences [physics]/Automatic ,Set (abstract data type) ,020901 industrial engineering & automation ,Teleoperation ,Eye tracking ,[INFO.INFO-RB]Computer Science [cs]/Robotics [cs.RO] ,Computer vision ,Artificial intelligence ,Electrical and Electronic Engineering ,0210 nano-technology ,business ,Instrumentation ,Block (data storage) - Abstract
Microassembly is an innovative alternative to the microfabrication process of MOEMS, which is quite complex. It usually implies the use of microrobots controlled by an operator. The reliability of this approach has already been confirmed for micro-optical technologies. However, the characterization of assemblies has shown that the operator is the main source of inaccuracies in teleoperated microassembly. Therefore, there is great interest in automating the microassembly process. One of the constraints of automation at the microscale is the lack of high-precision sensors capable of providing full information about the object position. Thus, visual feedback represents a very promising approach for automating the microassembly process. The purpose of this paper is to characterize techniques of object position estimation based on visual data, i.e., visual tracking techniques from the ViSP library. These algorithms recover the 3D object pose using a single view of the scene and the CAD model of the object. The performance of the three main types of model-based trackers is analyzed and quantified: the edge-based, texture-based, and hybrid trackers. The problems of visual tracking at the microscale are discussed. The micromanipulation station used in the framework of our project is controlled using a new Simulink block set. Experimental results demonstrate the possibility of obtaining a repeatability below 1 micrometer.
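At the core of all three tracker types is the same building block: projecting the CAD model's points into the image under a candidate pose, so the projection can be compared against image features. A minimal pinhole-projection sketch (the pose, intrinsics, and 2 mm square model face below are illustrative assumptions, not ViSP code):

```python
import numpy as np

# Pinhole projection of CAD-model points under a candidate pose (R, t).
# Model-based trackers iterate on (R, t) until these projections align
# with the edges or texture observed in the image.

def project(points_3d, R, t, f_px, c):
    """Project Nx3 model points (object frame) into pixel coordinates."""
    P_cam = points_3d @ R.T + t              # object frame -> camera frame
    return f_px * P_cam[:, :2] / P_cam[:, 2:3] + c

R = np.eye(3)                                # identity rotation
t = np.array([0.0, 0.0, 100.0])              # object 100 mm from camera
c = np.array([320.0, 240.0])                 # principal point (pixels)
corners = np.array([[-1.0, -1.0, 0.0],       # 2 x 2 mm square model face
                    [ 1.0, -1.0, 0.0],
                    [ 1.0,  1.0, 0.0],
                    [-1.0,  1.0, 0.0]])
uv = project(corners, R, t, f_px=4000.0, c=c)
# First corner projects to (4000 * -1/100 + 320, 4000 * -1/100 + 240)
# = (280, 200) pixels.
```

The three trackers differ only in which image measurements this projection is matched against: distances to intensity edges, texture patches, or both for the hybrid variant.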
- Published
- 2015
- Full Text
- View/download PDF
19. Characterization of Model-based Visual Tracking Techniques for MOEMS using a New Block Set for MATLAB/Simulink
- Author
-
Brahim Tamadazte, Cédric Clévy, Philippe Lutz, Andrey V. Kudryavtsev, Guillaume J. Laurent, Franche-Comté Électronique Mécanique, Thermique et Optique - Sciences et Technologies (UMR 6174) (FEMTO-ST), Université de Technologie de Belfort-Montbeliard (UTBM)-Ecole Nationale Supérieure de Mécanique et des Microtechniques (ENSMM)-Université de Franche-Comté (UFC), and Université Bourgogne Franche-Comté [COMUE] (UBFC)-Université Bourgogne Franche-Comté [COMUE] (UBFC)-Centre National de la Recherche Scientifique (CNRS)
- Subjects
Robot kinematics ,Computer science ,business.industry ,Process (computing) ,Automation ,Visualization ,[SPI.AUTO]Engineering Sciences [physics]/Automatic ,Teleoperation ,Eye tracking ,Robot ,Computer vision ,Artificial intelligence ,business ,Block (data storage) - Abstract
Microassembly is an innovative alternative to the microfabrication process of MOEMS, which is quite complex. It usually implies the use of microrobots controlled by an operator. The reliability of this approach has already been confirmed for micro-optical technologies. However, the characterization of assemblies has shown that the operator is the main source of inaccuracies in teleoperated microassembly, so there is great interest in automating the microassembly process. One of the constraints of automation at the microscale is the lack of high-precision sensors capable of providing full information about the object position. Thus, visual feedback represents a very promising approach for automating the microassembly process. The purpose of this paper is to characterize techniques of object position estimation based on visual data, i.e., visual tracking techniques from the ViSP library. These algorithms recover the 3D object pose using a single view of the scene and the CAD model of the object. The performance of the three main types of model-based trackers is analyzed and quantified: the edge-based, texture-based, and hybrid trackers. The problems of visual tracking at the microscale are discussed. The micromanipulation station used in the framework of our project is controlled using a new Simulink block set. Experimental results demonstrate the possibility of obtaining a repeatability below 1 µm.
- Published
- 2014
- Full Text
- View/download PDF