{"id":72,"date":"2020-03-15T01:24:36","date_gmt":"2020-03-14T22:24:36","guid":{"rendered":"https:\/\/kovan.ceng.metu.edu.tr\/portfolio\/"},"modified":"2021-04-21T08:49:20","modified_gmt":"2021-04-21T05:49:20","slug":"portfolio","status":"publish","type":"page","link":"https:\/\/kovan.ceng.metu.edu.tr\/?page_id=72","title":{"rendered":"Research"},"content":{"rendered":"\t\t<div data-elementor-type=\"wp-page\" data-elementor-id=\"72\" class=\"elementor elementor-72\">\n\t\t\t\t\t\t\t\t\t<section class=\"elementor-section elementor-top-section elementor-element elementor-element-df41f81 elementor-section-boxed elementor-section-height-default elementor-section-height-default\" data-id=\"df41f81\" data-element_type=\"section\">\n\t\t\t\t\t\t<div class=\"elementor-container elementor-column-gap-default\">\n\t\t\t\t\t<div class=\"aux-parallax-section elementor-column elementor-col-100 elementor-top-column elementor-element elementor-element-c91660e\" data-id=\"c91660e\" data-element_type=\"column\">\n\t\t\t<div class=\"elementor-widget-wrap elementor-element-populated\">\n\t\t\t\t\t\t\t\t<div class=\"elementor-element elementor-element-fc7795d elementor-tabs-view-horizontal elementor-widget elementor-widget-tabs\" data-id=\"fc7795d\" data-element_type=\"widget\" data-widget_type=\"tabs.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<div class=\"elementor-tabs\">\n\t\t\t<div class=\"elementor-tabs-wrapper\" role=\"tablist\" >\n\t\t\t\t\t\t\t\t\t<div id=\"elementor-tab-title-2641\" class=\"elementor-tab-title elementor-tab-desktop-title\" aria-selected=\"true\" data-tab=\"1\" role=\"tab\" tabindex=\"0\" aria-controls=\"elementor-tab-content-2641\" aria-expanded=\"false\">Projects<\/div>\n\t\t\t\t\t\t\t\t\t<div id=\"elementor-tab-title-2642\" class=\"elementor-tab-title elementor-tab-desktop-title\" aria-selected=\"false\" data-tab=\"2\" role=\"tab\" tabindex=\"-1\" aria-controls=\"elementor-tab-content-2642\" 
aria-expanded=\"false\">Topics<\/div>\n\t\t\t\t\t\t\t\t\t<div id=\"elementor-tab-title-2643\" class=\"elementor-tab-title elementor-tab-desktop-title\" aria-selected=\"false\" data-tab=\"3\" role=\"tab\" tabindex=\"-1\" aria-controls=\"elementor-tab-content-2643\" aria-expanded=\"false\">Robots<\/div>\n\t\t\t\t\t\t\t<\/div>\n\t\t\t<div class=\"elementor-tabs-content-wrapper\" role=\"tablist\" aria-orientation=\"vertical\">\n\t\t\t\t\t\t\t\t\t<div class=\"elementor-tab-title elementor-tab-mobile-title\" aria-selected=\"true\" data-tab=\"1\" role=\"tab\" tabindex=\"0\" aria-controls=\"elementor-tab-content-2641\" aria-expanded=\"false\">Projects<\/div>\n\t\t\t\t\t<div id=\"elementor-tab-content-2641\" class=\"elementor-tab-content elementor-clearfix\" data-tab=\"1\" role=\"tabpanel\" aria-labelledby=\"elementor-tab-title-2641\" tabindex=\"0\" hidden=\"false\"><h2><span class=\"mw-headline\">Border\u00a0<\/span>Ownership Project (TUBITAK)<\/h2>\n<h2><a href=\"http:\/\/kovan.ceng.metu.edu.tr\/index.php\/BorderOwnership\"><img decoding=\"async\" class=\"alignnone wp-image-464 size-medium\" style=\"font-size: 2.375em;\" src=\"https:\/\/kovan.ceng.metu.edu.tr\/wp-content\/uploads\/2021\/03\/427px-Project-Border-Ownership-300x150.png\" alt=\"\" width=\"300\" height=\"150\" \/><\/a><\/h2>\n<p>In this project, we have three goals: (1) Investigate the mechanisms important for determining border ownership. (2) Use and interpret the results of the investigation in item (1) to develop a computational model that would estimate the border ownership of the edges in the images. 
(3) Apply the developed computational model to important vision problems to demonstrate that using border ownership improves the acquisition of reliable and complete visual information.<\/p>\n<p>Click <a title=\"BorderOwnership\" href=\"http:\/\/kovan.ceng.metu.edu.tr\/index.php\/BorderOwnership\" data-wplink-edit=\"true\">here<\/a>\u00a0for more information.<\/p>\n<h2><span class=\"mw-headline\">Development of Hierarchical Concepts in Humanoid Robots Project (TUBITAK)<\/span><\/h2>\n<p><a id=\"Development_of_Hierarchical_Concepts_in_Humanoid_Robots_Project_.28TUBITAK.29\" name=\"Development_of_Hierarchical_Concepts_in_Humanoid_Robots_Project_.28TUBITAK.29\"><\/a><a href=\"http:\/\/kovan.ceng.metu.edu.tr\/index.php\/BorderOwnership\"><img decoding=\"async\" class=\"alignnone wp-image-465 size-thumbnail\" src=\"https:\/\/kovan.ceng.metu.edu.tr\/wp-content\/uploads\/2021\/03\/Project-MultilevelConceptualization-150x150.png\" alt=\"\" width=\"150\" height=\"150\" srcset=\"https:\/\/kovan.ceng.metu.edu.tr\/wp-content\/uploads\/2021\/03\/Project-MultilevelConceptualization-150x150.png 150w, https:\/\/kovan.ceng.metu.edu.tr\/wp-content\/uploads\/2021\/03\/Project-MultilevelConceptualization-300x300.png 300w\" sizes=\"(max-width: 150px) 100vw, 150px\" \/><\/a><\/p>\n<p>In this project, we will study how a (cognitively) developing and embodied humanoid robot can acquire a hierarchical representation of concepts from its experiences. To this end, going beyond the current literature, we will use the language, appearance, and affordances of objects and investigate how these three modalities affect the formation of the hierarchy. 
The proposed methods and mechanisms will be demonstrated in a concise scenario in which the humanoid robot iCub interacts with a human on a clearly defined task.<\/p>\n<p>Click\u00a0<a title=\"MultilevelConceptualization\" href=\"http:\/\/kovan.ceng.metu.edu.tr\/index.php\/MultilevelConceptualization\" data-wplink-edit=\"true\">here<\/a>\u00a0for more information.<a id=\"EU_ROSSI_Project\" name=\"EU_ROSSI_Project\"><\/a><\/p>\n<h2><span class=\"mw-headline\">EU ROSSI Project<\/span><\/h2>\n<div class=\"floatright\"><a title=\"ROSSI\" href=\"http:\/\/kovan.ceng.metu.edu.tr\/index.php\/ROSSI\"><img decoding=\"async\" src=\"http:\/\/kovan.ceng.metu.edu.tr\/images\/9\/90\/RemovA.png\" alt=\"\" width=\"99\" height=\"80\" border=\"0\" \/><\/a><\/div>\n<p>ROSSI (Emergence of Communication in Robots through Sensorimotor and Social Interaction) is an FP7 EU Project (FP7-216125) which started on 1 February 2008 and will run for three years. ROSSI pursues the following two goals:<\/p>\n<ul>\n<li>to investigate the role of canonical and mirror neurons in the development of concepts and language,<\/li>\n<li>to develop novel methods to make artificial agents (for example, robots) acquire concepts and the ability to explore and communicate about the environment.<\/li>\n<\/ul>\n<p>For more details, click\u00a0<a title=\"ROSSI\" href=\"http:\/\/kovan.ceng.metu.edu.tr\/index.php\/ROSSI\">here<\/a>.<a id=\"EU_MACS_Project\" name=\"EU_MACS_Project\"><\/a><\/p>\n<h2><span class=\"mw-headline\">EU MACS Project<\/span><\/h2>\n<div class=\"floatleft\"><a title=\"MACS\" href=\"http:\/\/kovan.ceng.metu.edu.tr\/index.php\/MACS\"><img loading=\"lazy\" decoding=\"async\" src=\"http:\/\/kovan.ceng.metu.edu.tr\/images\/3\/3a\/Macs-logo.jpg\" alt=\"\" width=\"105\" height=\"35\" border=\"0\" \/><\/a><\/div>\n<p>How can concepts from Cognitive Systems be applied to Autonomous Robots? 
This research track is supported by the MACS project, a STREP project accepted by the EC (European Commission) in FP6 within the Cognitive Systems strategic objective call of IST.<\/p>\n<p>For more details, click\u00a0<a title=\"MACS\" href=\"http:\/\/kovan.ceng.metu.edu.tr\/index.php\/MACS\">here<\/a>.<a id=\"CoSwarm\" name=\"CoSwarm\"><\/a><\/p>\n<h2><span class=\"mw-headline\">CoSwarm<\/span><\/h2>\n<div class=\"thumb tright\">\n<div class=\"thumbinner\">\n<p><a class=\"image\" title=\"Controllable-swarm-logo-kucuk2.jpg\" href=\"http:\/\/kovan.ceng.metu.edu.tr\/index.php\/File:Controllable-swarm-logo-kucuk2.jpg\"><img loading=\"lazy\" decoding=\"async\" class=\"thumbimage\" src=\"http:\/\/kovan.ceng.metu.edu.tr\/images\/thumb\/6\/69\/Controllable-swarm-logo-kucuk2.jpg\/120px-Controllable-swarm-logo-kucuk2.jpg\" alt=\"\" width=\"120\" height=\"94\" border=\"0\" \/><\/a><\/p>\n<div class=\"thumbcaption\">\n<div class=\"magnify\">\u00a0<\/div>\n<\/div>\n<\/div>\n<\/div>\n<p>The main scientific objective of this project is to investigate how and to what extent the dynamics of a robotic swarm can be externally controlled. In the project, a heterogeneous swarm, consisting of two types of mobile robots, one in large numbers but simple, the other in fewer numbers but more complex, will be developed. The experiments to be conducted with the real robots will be complemented by systematic experiments carried out in physically realistic simulation models that will also be developed. 
Click\u00a0<a title=\"CoSwarm\" href=\"http:\/\/kovan.ceng.metu.edu.tr\/index.php\/CoSwarm\">here<\/a>\u00a0for more information.<a id=\"Embedded_Linux_for_CoSwarm\" name=\"Embedded_Linux_for_CoSwarm\"><\/a><\/p>\n<h2><span class=\"mw-headline\">Embedded Linux for CoSwarm<\/span><\/h2>\n<p><a class=\"image\" title=\"Img1.png\" href=\"http:\/\/kovan.ceng.metu.edu.tr\/index.php\/File:Img1.png\"><img loading=\"lazy\" decoding=\"async\" src=\"http:\/\/kovan.ceng.metu.edu.tr\/images\/4\/4b\/Img1.png\" alt=\"\" width=\"200\" height=\"150\" border=\"0\" \/><\/a><\/p>\n<p>This project covers the development of KOBOT&#8217;s imaging subsystem and is intended to expand as the project progresses. Click\u00a0<a title=\"Embedded Linux\" href=\"http:\/\/kovan.ceng.metu.edu.tr\/index.php\/Embedded_Linux\">here<\/a>\u00a0for more information.<a id=\"From_Ants_to_Robots\" name=\"From_Ants_to_Robots\"><\/a><\/p>\n<h2><span class=\"mw-headline\">From Ants to Robots<\/span><a id=\"Assessment_of_space\" name=\"Assessment_of_space\"><\/a><\/h2>\n<h3><span class=\"mw-headline\">Assessment of space<\/span><\/h3>\n<p><a class=\"image\" title=\"Assesment.jpg\" href=\"http:\/\/kovan.ceng.metu.edu.tr\/index.php\/File:Assesment.jpg\"><img loading=\"lazy\" decoding=\"async\" src=\"http:\/\/kovan.ceng.metu.edu.tr\/images\/e\/e5\/Assesment.jpg\" alt=\"\" width=\"200\" height=\"147\" border=\"0\" \/><\/a><\/p>\n<p>How do ants assess the size and integrity of a closed space with limited perceptual sensing, and how can their methods be applied to mobile robots? 
Click\u00a0<a title=\"Projects\/From Ants to Robots\" href=\"http:\/\/kovan.ceng.metu.edu.tr\/index.php\/Projects\/From_Ants_to_Robots\">here<\/a>\u00a0for more information.<a id=\"Previous_Research_Topics\" name=\"Previous_Research_Topics\"><\/a><\/p>\n<h2><span class=\"mw-headline\">Previous Research Topics<\/span><a id=\"Swarm_Bots\" name=\"Swarm_Bots\"><\/a><\/h2>\n<h3><span class=\"mw-headline\">Swarm Bots<\/span><\/h3>\n<p><a class=\"image\" title=\"Image:swarm-bots-logo.gif\" href=\"http:\/\/kovan.ceng.metu.edu.tr\/index.php\/File:Swarm-bots-logo.gif\"><img loading=\"lazy\" decoding=\"async\" src=\"http:\/\/kovan.ceng.metu.edu.tr\/images\/4\/42\/Swarm-bots-logo.gif\" alt=\"Image:swarm-bots-logo.gif\" width=\"292\" height=\"108\" border=\"0\" \/><\/a><\/p>\n<p>Sub-contract to IRIDIA for the\u00a0<a class=\"external text\" title=\"http:\/\/www.swarm-bots.org\/\" href=\"http:\/\/www.swarm-bots.org\/\" rel=\"nofollow\">Swarm-bots project<\/a>.<a id=\"Sanal_Robot_Kolonisi\" name=\"Sanal_Robot_Kolonisi\"><\/a><\/p>\n<h3><span class=\"mw-headline\">Sanal Robot Kolonisi<\/span><\/h3>\n<p>&#8220;Sanal Robot Kolonisi&#8221; (Virtual Robot Colony), a project supported by BAP.<a id=\"Kendi_Kendine_.C3.96rg.C3.BCtlenebilir_Robot_O.C4.9Fulu\" name=\"Kendi_Kendine_.C3.96rg.C3.BCtlenebilir_Robot_O.C4.9Fulu\"><\/a><\/p>\n<h3><span class=\"mw-headline\">Kendi Kendine \u00d6rg\u00fctlenebilir Robot O\u011fulu<\/span><\/h3>\n<p>&#8220;Kendi Kendine Orgutlenebilir Robot Ogulu&#8221; (Self-Organizing Robot Swarm), an inter-disciplinary BAP project run together with Dr. Bugra Koku from Dept. of Mechanical Eng. 
of METU.<a id=\"Caligrapher_Robot_Project\" name=\"Caligrapher_Robot_Project\"><\/a><\/p>\n<h3><span class=\"mw-headline\">Calligrapher Robot Project<\/span><\/h3>\n<p><a class=\"image\" title=\"Calligrapher.jpg\" href=\"http:\/\/kovan.ceng.metu.edu.tr\/index.php\/File:Calligrapher.jpg\"><img loading=\"lazy\" decoding=\"async\" src=\"http:\/\/kovan.ceng.metu.edu.tr\/images\/f\/ff\/Calligrapher.jpg\" alt=\"\" width=\"200\" height=\"200\" border=\"0\" \/><\/a><\/p>\n<p>This project aimed to design a robot controlled by a remote agent through a Bluetooth connection. A person holding a PDA would be able to control it by moving the stylus. The robot would be capable of holonomic motion, and would mimic the path that the stylus draws on the screen. Click\u00a0<a class=\"external text\" title=\"http:\/\/www.kovan.ceng.metu.edu.tr\/pub\/html\/calligrapher_robot.html\" href=\"http:\/\/www.kovan.ceng.metu.edu.tr\/pub\/html\/calligrapher_robot.html\" rel=\"nofollow\">here<\/a>\u00a0for more information.<a id=\"Dancing_Robots_Project\" name=\"Dancing_Robots_Project\"><\/a><\/p>\n<h3><span class=\"mw-headline\">Dancing Robots Project<\/span><\/h3>\n<p><a class=\"image\" title=\"Mindstorm.jpg\" href=\"http:\/\/kovan.ceng.metu.edu.tr\/index.php\/File:Mindstorm.jpg\"><img loading=\"lazy\" decoding=\"async\" src=\"http:\/\/kovan.ceng.metu.edu.tr\/images\/4\/4d\/Mindstorm.jpg\" alt=\"\" width=\"200\" height=\"115\" border=\"0\" \/><\/a><\/p>\n<p>The goal of the project was to use PDAs with the LEGO Mindstorms kits. Through this we were not only able to expand the programming and display capabilities of the Mindstorms but also able to use the IrDA (infrared) communication ability of the PDAs as a medium of communication between the robots. 
Click\u00a0<a class=\"external text\" title=\"http:\/\/kovan.ceng.metu.edu.tr\/pub\/html\/dancing_robots_english.html\" href=\"http:\/\/kovan.ceng.metu.edu.tr\/pub\/html\/dancing_robots_english.html\" rel=\"nofollow\">here<\/a>\u00a0for more information.<a id=\"Instincts_for_Guiding_and_Energizing_Learning\" name=\"Instincts_for_Guiding_and_Energizing_Learning\"><\/a><\/p>\n<h3><span class=\"mw-headline\">Instincts for Guiding and Energizing Learning<\/span><\/h3>\n<p><a class=\"image\" title=\"Pav.gif\" href=\"http:\/\/kovan.ceng.metu.edu.tr\/index.php\/File:Pav.gif\"><img loading=\"lazy\" decoding=\"async\" src=\"http:\/\/kovan.ceng.metu.edu.tr\/images\/3\/33\/Pav.gif\" alt=\"\" width=\"200\" height=\"113\" border=\"0\" \/><\/a><\/p>\n<p>How can learning be scaled for autonomous systems? Where should the line be drawn between evolution and learning for adaptation?<a id=\"3D_Colored_Range_Image_Construction\" name=\"3D_Colored_Range_Image_Construction\"><\/a><\/p>\n<h3><span class=\"mw-headline\">3D Colored Range Image Construction<\/span><\/h3>\n<div class=\"floatright\"><a class=\"image\" title=\"3d-colored-range.png\" href=\"http:\/\/kovan.ceng.metu.edu.tr\/index.php\/File:3d-colored-range.png\"><img loading=\"lazy\" decoding=\"async\" src=\"http:\/\/kovan.ceng.metu.edu.tr\/images\/8\/86\/3d-colored-range.png\" alt=\"\" width=\"200\" height=\"149\" border=\"0\" \/><\/a><\/div>\n<p>The Kurt3D robot acquires distance information for the points in its scan range using its laser scanner. Since distance information is acquired along the scan rays, only the distances of points lying in the direction of the scan rays can be calculated. The laser scanner does not give any color information about these points. 
Color information of these points can be obtained from the two color cameras of the Kurt3D robot, if the points are in one of the camera views.<\/p>\n<p>The objective of this project is to fuse the range and camera images obtained by the Kurt3D robot into a single colored range image. While building a 3D model of the robot environment in a single image, we combined the distance and color information acquired from the laser scanner and the cameras. Click\u00a0<a class=\"external text\" title=\"http:\/\/kovan.ceng.metu.edu.tr\/pub\/html\/color_range.html\" href=\"http:\/\/kovan.ceng.metu.edu.tr\/pub\/html\/color_range.html\" rel=\"nofollow\">here<\/a>\u00a0for more information.<\/p><\/div>\n\t\t\t\t\t\t\t\t\t<div class=\"elementor-tab-title elementor-tab-mobile-title\" aria-selected=\"false\" data-tab=\"2\" role=\"tab\" tabindex=\"-1\" aria-controls=\"elementor-tab-content-2642\" aria-expanded=\"false\">Topics<\/div>\n\t\t\t\t\t<div id=\"elementor-tab-content-2642\" class=\"elementor-tab-content elementor-clearfix\" data-tab=\"2\" role=\"tabpanel\" aria-labelledby=\"elementor-tab-title-2642\" tabindex=\"0\" hidden=\"hidden\"><h2><span class=\"mw-headline\">Cognitive Robotics<\/span><\/h2><p>Through the EU-funded\u00a0<a title=\"ROSSI\" href=\"http:\/\/kovan.ceng.metu.edu.tr\/index.php\/ROSSI\">ROSSI<\/a>\u00a0and\u00a0<a title=\"MACS\" href=\"http:\/\/kovan.ceng.metu.edu.tr\/index.php\/MACS\">MACS<\/a>\u00a0projects, we have become interested in developing artificial agents (for example, robots) that self-develop through self-motivated interactions with the environment and interact with other agents (for example, human beings).<a id=\"Swarm_Intelligence\" name=\"Swarm_Intelligence\"><\/a><\/p><h2><span class=\"mw-headline\">Swarm Intelligence<\/span><\/h2><p>It is interesting to see how small animals and insects in nature, with limited motor and sensory capabilities, can perform complex behaviors by communicating and cooperating with other members of their colonies. 
Motivated by such colonies in nature, we investigate how we can control a swarm of robots to self-organize and self-assemble to accomplish global tasks and how evolutionary methods can be used for this purpose.<\/p><p>See the relevant projects\u00a0<a title=\"CoSwarm\" href=\"http:\/\/kovan.ceng.metu.edu.tr\/index.php\/CoSwarm\">CoSwarm<\/a>\u00a0and\u00a0<a title=\"Projects\/From Ants to Robots\" href=\"http:\/\/kovan.ceng.metu.edu.tr\/index.php\/Projects\/From_Ants_to_Robots\">From Ants to Robots<\/a>.<a id=\"Cognitive_and_Computer_Vision\" name=\"Cognitive_and_Computer_Vision\"><\/a><\/p><h2><span class=\"mw-headline\">Cognitive and Computer Vision<\/span><\/h2><p>Visual perception is an important ingredient for building cognitive robots. However, we are mostly interested in making use of visual systems that are biologically plausible. In particular, we would like to build methodologies that would allow an artificial agent that has only simple and inaccurate visual processes to develop its visual capabilities through experience.<\/p><\/div>\n\t\t\t\t\t\t\t\t\t<div class=\"elementor-tab-title elementor-tab-mobile-title\" aria-selected=\"false\" data-tab=\"3\" role=\"tab\" tabindex=\"-1\" aria-controls=\"elementor-tab-content-2643\" aria-expanded=\"false\">Robots<\/div>\n\t\t\t\t\t<div id=\"elementor-tab-content-2643\" class=\"elementor-tab-content elementor-clearfix\" data-tab=\"3\" role=\"tabpanel\" aria-labelledby=\"elementor-tab-title-2643\" tabindex=\"0\" hidden=\"hidden\"><p>Below are the robots and their descriptions.<\/p><h2><span class=\"mw-headline\"><b>Nao<\/b><\/span><\/h2><p><img loading=\"lazy\" decoding=\"async\" class=\"wp-image-475 size-thumbnail alignnone\" src=\"https:\/\/kovan.ceng.metu.edu.tr\/wp-content\/uploads\/2021\/03\/Nao1-150x150.jpg\" alt=\"\" width=\"150\" height=\"150\" srcset=\"https:\/\/kovan.ceng.metu.edu.tr\/wp-content\/uploads\/2021\/03\/Nao1-150x150.jpg 150w, 
https:\/\/kovan.ceng.metu.edu.tr\/wp-content\/uploads\/2021\/03\/Nao1-300x300.jpg 300w, https:\/\/kovan.ceng.metu.edu.tr\/wp-content\/uploads\/2021\/03\/Nao1.jpg 400w\" sizes=\"(max-width: 150px) 100vw, 150px\" \/><img loading=\"lazy\" decoding=\"async\" class=\"wp-image-476 size-thumbnail alignnone\" src=\"https:\/\/kovan.ceng.metu.edu.tr\/wp-content\/uploads\/2021\/03\/Nao2-150x150.jpg\" alt=\"\" width=\"150\" height=\"150\" srcset=\"https:\/\/kovan.ceng.metu.edu.tr\/wp-content\/uploads\/2021\/03\/Nao2-150x150.jpg 150w, https:\/\/kovan.ceng.metu.edu.tr\/wp-content\/uploads\/2021\/03\/Nao2-300x300.jpg 300w\" sizes=\"(max-width: 150px) 100vw, 150px\" \/><\/p><p>We have recently acquired a small friendly humanoid called\u00a0<a class=\"external text\" title=\"http:\/\/en.wikipedia.org\/wiki\/Nao_%28robot%29\" href=\"http:\/\/en.wikipedia.org\/wiki\/Nao_%28robot%29\" rel=\"nofollow\">Nao<\/a>.<\/p><h2><span class=\"mw-headline\"><b>iCub<\/b><\/span><\/h2><p><img loading=\"lazy\" decoding=\"async\" class=\"wp-image-477 size-thumbnail alignleft\" src=\"https:\/\/kovan.ceng.metu.edu.tr\/wp-content\/uploads\/2021\/03\/ICub-birth-17-150x150.jpg\" alt=\"\" width=\"150\" height=\"150\" srcset=\"https:\/\/kovan.ceng.metu.edu.tr\/wp-content\/uploads\/2021\/03\/ICub-birth-17-150x150.jpg 150w, https:\/\/kovan.ceng.metu.edu.tr\/wp-content\/uploads\/2021\/03\/ICub-birth-17-300x300.jpg 300w\" sizes=\"(max-width: 150px) 100vw, 150px\" \/><\/p><p><img loading=\"lazy\" decoding=\"async\" class=\"wp-image-478 size-thumbnail alignnone\" src=\"https:\/\/kovan.ceng.metu.edu.tr\/wp-content\/uploads\/2021\/03\/ICub-work-2-150x150.jpg\" alt=\"\" width=\"150\" height=\"150\" srcset=\"https:\/\/kovan.ceng.metu.edu.tr\/wp-content\/uploads\/2021\/03\/ICub-work-2-150x150.jpg 150w, https:\/\/kovan.ceng.metu.edu.tr\/wp-content\/uploads\/2021\/03\/ICub-work-2-300x300.jpg 300w\" sizes=\"(max-width: 150px) 100vw, 150px\" \/><\/p><p>\u00a0<\/p><p>Click <a title=\"ICub gallery\" 
href=\"http:\/\/kovan.ceng.metu.edu.tr\/index.php\/ICub_gallery\">here<\/a>\u00a0for some pictures of iCub.<\/p><p>iCub is a humanoid robot that we received through our successful bid for a humanoid robot platform in the RobotCub project,\u00a0<a class=\"external text\" title=\"http:\/\/eris.liralab.it\/wiki\/RobotCub_Open_Call\" href=\"http:\/\/eris.liralab.it\/wiki\/RobotCub_Open_Call\" rel=\"nofollow\">ranking 6th among 31 proposals<\/a>! The project will be carried out in conjunction with the\u00a0<a class=\"external text\" title=\"http:\/\/www.rossiproject.eu\/\" href=\"http:\/\/www.rossiproject.eu\/\" rel=\"nofollow\">EU ROSSI project<\/a>\u00a0and in close collaboration with\u00a0<a class=\"external text\" title=\"http:\/\/personal.his.se\/person.asp?record_id=1930\" href=\"http:\/\/personal.his.se\/person.asp?record_id=1930\" rel=\"nofollow\">Tom Ziemke<\/a>\u00a0and\u00a0<a class=\"external text\" title=\"http:\/\/www.cns.atr.jp\/~erhan\/\" href=\"http:\/\/www.cns.atr.jp\/~erhan\/\" rel=\"nofollow\">Erhan Oztop<\/a>. 
Check out the project\u00a0<a class=\"external text\" title=\"http:\/\/eris.liralab.it\/wiki\/Emergence_of_Communication_in_iCub_through_Sensorimotor_and_Social_Interaction\" href=\"http:\/\/eris.liralab.it\/wiki\/Emergence_of_Communication_in_iCub_through_Sensorimotor_and_Social_Interaction\" rel=\"nofollow\">web page<\/a>.<\/p><p>With iCub, we would like to investigate the development of concepts and communication through interaction with the environment.<\/p><p>For information about iCub in Turkish, please\u00a0<a title=\"ICub\" href=\"http:\/\/kovan.ceng.metu.edu.tr\/index.php\/ICub\">click here<\/a>.<\/p><h2><span class=\"mw-headline\"><b>Kurt3D<\/b><\/span><\/h2><p><a class=\"image\" title=\"Image:Kurt3D.jpg\" href=\"http:\/\/kovan.ceng.metu.edu.tr\/index.php\/File:Kurt3D.jpg\"><img loading=\"lazy\" decoding=\"async\" src=\"http:\/\/kovan.ceng.metu.edu.tr\/images\/3\/37\/Kurt3D.jpg\" alt=\"Image:Kurt3D.jpg\" width=\"200\" height=\"150\" border=\"0\" \/><\/a><\/p><p>Kurt3D is an autonomous mobile robot equipped with a reliable and precise 3D laser scanner that digitizes environments. High-quality geometric 3D maps with semantic information are automatically generated after the robot&#8217;s exploration.<\/p><p>Kovan Lab has one Kurt3D robot used in our cognitive robotics projects.<\/p><h2><span class=\"mw-headline\"><b>Kobot<\/b><\/span><\/h2><div class=\"floatright\"><a class=\"image\" title=\"Kobot.jpg\" href=\"http:\/\/kovan.ceng.metu.edu.tr\/index.php\/File:Kobot.jpg\"><img loading=\"lazy\" decoding=\"async\" src=\"http:\/\/kovan.ceng.metu.edu.tr\/images\/0\/0e\/Kobot.jpg\" alt=\"\" width=\"146\" height=\"243\" border=\"0\" \/><\/a><\/div><p>Controllable Swarm Robot: KOBOT<\/p><p>KOBOT has been designed by the KOVAN research lab for use in swarm robotics studies. 
It features:<\/p><ul><li>Asymmetrical body having a circular shape with 120 mm diameter and 70 mm height<\/li><li>300 g weight with the standard robot configuration<\/li><li>Differential steering system using two high-quality low-power DC gearhead motors<\/li><li>8 infrared sensors<\/li><li>HMC6352 digital compass module<\/li><li>4 hours of autonomy with Li-Polymer batteries<\/li><li>PXA255 processor for processing images coming from the omnidirectional viewing system<\/li><li>Remote programming capability via IEEE 802.15.4\/ZigBee-compliant XBee OEM RF modules<\/li><li>3 bright LEDs and a buzzer on-board to indicate the robot&#8217;s internal state<\/li><\/ul><p>Click\u00a0<a title=\"Robots\\Kobot\" href=\"http:\/\/kovan.ceng.metu.edu.tr\/index.php\/Robots%5CKobot\">HERE<\/a>\u00a0for more information&#8230;<\/p><h2><span class=\"mw-headline\"><b>Khepera<\/b><\/span><\/h2><div class=\"floatleft\"><a class=\"image\" title=\"Khepera.jpg\" href=\"http:\/\/kovan.ceng.metu.edu.tr\/index.php\/File:Khepera.jpg\"><img loading=\"lazy\" decoding=\"async\" src=\"http:\/\/kovan.ceng.metu.edu.tr\/images\/c\/c6\/Khepera.jpg\" alt=\"\" width=\"200\" height=\"148\" border=\"0\" \/><\/a><\/div><p>Khepera is a miniature mobile robot with functionality similar to that of larger robots used in research and education. Khepera was originally designed as a research and teaching tool for a Swiss Research Priority Program at EPFL in Lausanne. It allows real-world testing of algorithms developed in simulation for trajectory planning, obstacle avoidance, pre-processing of sensory information, and hypotheses on behaviour processing, among others.<\/p><p>Very modular at both the software and hardware level, Khepera has a very efficient library of on-board applications for controlling the robot, monitoring experiments, and downloading new software. 
A large number of extension modules make it adaptable to a wide range of experimentation.<\/p><p>Kovan Lab has one Khepera robot.<\/p><h2><span class=\"mw-headline\"><b>Hemisson<\/b><\/span><\/h2><div class=\"floatleft\"><a class=\"image\" title=\"Hemisson.jpg\" href=\"http:\/\/kovan.ceng.metu.edu.tr\/index.php\/File:Hemisson.jpg\"><img loading=\"lazy\" decoding=\"async\" src=\"http:\/\/kovan.ceng.metu.edu.tr\/images\/3\/38\/Hemisson.jpg\" alt=\"\" width=\"180\" height=\"173\" border=\"0\" \/><\/a><\/div><p>Hemisson is a small mobile robot created by the Swiss company K-Team, and a nice tool for teachers and hobbyists.<\/p><p>Among others, it comes with two motors (to drive both wheels independently), a generous set of sensors (including eight light sensors) and a TV remote. It has three pre-installed programs (obstacle avoidance, line following and dance), and two hours of autonomy.<\/p><p>It is a pre-mounted robot, suitable for teachers who can use it as a demonstration tool, as well as for hobbyists: it can be programmed to change its behaviour, and its firmware can be changed too.<\/p><p>Kovan Lab has five Hemisson robots used in our swarm-robotics work.<\/p><h2><span class=\"mw-headline\"><b>LEGO Mindstorms<\/b><\/span><\/h2><div class=\"floatright\"><a class=\"image\" title=\"Legomindstorms.jpg\" href=\"http:\/\/kovan.ceng.metu.edu.tr\/index.php\/File:Legomindstorms.jpg\"><img loading=\"lazy\" decoding=\"async\" src=\"http:\/\/kovan.ceng.metu.edu.tr\/images\/b\/b8\/Legomindstorms.jpg\" alt=\"\" width=\"350\" height=\"202\" border=\"0\" \/><\/a><\/div><p>The LEGO MINDSTORMS Robotics Invention System (RIS) is a kit for building robots. It consists of a programmable brick (with a Hitachi H8\/3292 microprocessor) named the RCX, and many other traditional LEGO building parts named TECHNICS.<\/p><p>The RCX has 3 input ports, 3 output ports, and an infrared transmitter\/receiver. 
You can connect the inputs to sensors so that the RCX is aware of what is happening outside. LEGO offers many kinds of sensors, but only two come with the RIS package; you must purchase the others separately if you want to use them. The sensors that come with the package are two bumpers (touch sensors) and a light sensor. The 3 outputs can be connected to motors, so that your robot can move.<\/p><p>Our lab has 5 LEGO Mindstorms Robotics Invention Systems (RIS).<\/p><\/div>\n\t\t\t\t\t\t\t<\/div>\n\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/div>\n\t\t\t\t\t\t\t<\/div>\n\t\t<\/section>\n\t\t\t\t\t\t\t<\/div>\n\t\t","protected":false},"excerpt":{"rendered":"<p>Projects Topics Robots Projects Border\u00a0Ownership Project (TUBITAK) In this project, we have three goals: (1) Investigate the mechanisms important for determining border ownership. (2) Use and interpret the results of the investigation in item (1) to develop a computational model that would estimate the border ownership of the edges in the images. 
(3) Apply the [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"parent":0,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"templates\/page-builder-content.php","meta":{"footnotes":""},"class_list":["post-72","page","type-page","status-publish","hentry"],"_links":{"self":[{"href":"https:\/\/kovan.ceng.metu.edu.tr\/index.php?rest_route=\/wp\/v2\/pages\/72","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/kovan.ceng.metu.edu.tr\/index.php?rest_route=\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/kovan.ceng.metu.edu.tr\/index.php?rest_route=\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/kovan.ceng.metu.edu.tr\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/kovan.ceng.metu.edu.tr\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=72"}],"version-history":[{"count":16,"href":"https:\/\/kovan.ceng.metu.edu.tr\/index.php?rest_route=\/wp\/v2\/pages\/72\/revisions"}],"predecessor-version":[{"id":557,"href":"https:\/\/kovan.ceng.metu.edu.tr\/index.php?rest_route=\/wp\/v2\/pages\/72\/revisions\/557"}],"wp:attachment":[{"href":"https:\/\/kovan.ceng.metu.edu.tr\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=72"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}