10 skills found
openclassify / Openclassify
OpenClassify is a modular, advanced open-source classified-ads platform built with Laravel 12 & PHP 8.5.
gurtejrehal / FALCON AI Data Crawler
Falcon Search was created for the National Crime Records Bureau to meet the need for an efficient AI data crawler that collects classified data from the web based on given keywords. It is a SaaS web data integration (WDI) platform that converts unstructured web data into a structured format by extracting, preparing, and integrating crime-related web data for consumption by criminal investigation agencies. Falcon provides a visual environment for automating the workflow of extracting and transforming web data. After the target website URL is specified, the web data extraction module provides a visual environment for designing automated harvesting workflows, going beyond HTML/XML parsing of static content to automate end-user interactions and yield data that would otherwise not be immediately visible. Once data is extracted, the software provides full data preparation capabilities for harmonizing and cleansing it. For consuming the results, Falcon offers several options: its own visualization and dashboarding module helps criminal investigators gain the insights they need, and its APIs offer full access to everything that can be done on the platform, allowing web data to be integrated directly. FALCON is capable of crawling ten million links and scraping one million links per month using a Celery worker, and could potentially exceed these numbers on standard cloud platforms.
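The core extraction step described above, turning unstructured HTML into structured records matched against given keywords, can be sketched with Python's standard library. This is a minimal illustration only: the tag choice, record fields, and sample page are invented, not Falcon's actual schema or workflow.

```python
from html.parser import HTMLParser

class KeywordExtractor(HTMLParser):
    """Minimal sketch of a web-data extraction step: collect the text of
    paragraph tags that contain any of the given keywords, producing
    structured records from unstructured HTML."""

    def __init__(self, keywords):
        super().__init__()
        self.keywords = [k.lower() for k in keywords]
        self.records = []           # structured output: list of dicts
        self._in_paragraph = False
        self._buffer = []

    def handle_starttag(self, tag, attrs):
        if tag == "p":
            self._in_paragraph = True
            self._buffer = []

    def handle_data(self, data):
        if self._in_paragraph:
            self._buffer.append(data)

    def handle_endtag(self, tag):
        if tag == "p" and self._in_paragraph:
            text = "".join(self._buffer).strip()
            matched = [k for k in self.keywords if k in text.lower()]
            if matched:
                self.records.append({"text": text, "keywords": matched})
            self._in_paragraph = False

# hypothetical sample page and keyword list
page = "<html><body><p>Fraud case reported downtown.</p><p>Weather is sunny.</p></body></html>"
extractor = KeywordExtractor(["fraud", "theft"])
extractor.feed(page)
```

A real crawler would run many such extractions as Celery tasks and feed the resulting records into the preparation and visualization stages.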
Yogapriya2512 / A Simple Chatbot
A chatbot (also known as a talkbot, chatterbot, Bot, IM bot, interactive agent, or Artificial Conversational Entity) is a computer program or an artificial intelligence which conducts a conversation via auditory or textual methods. Such programs are often designed to convincingly simulate how a human would behave as a conversational partner, thereby passing the Turing test. Chatbots are typically used in dialog systems for various practical purposes including customer service or information acquisition. Some chatterbots use sophisticated natural language processing systems, but many simpler systems scan for keywords within the input, then pull a reply with the most matching keywords, or the most similar wording pattern, from a database. The term "ChatterBot" was originally coined by Michael Mauldin (creator of the first Verbot, Julia) in 1994 to describe these conversational programs.
The classic historic early chatbots are ELIZA (1966) and PARRY (1972). More recent notable programs include A.L.I.C.E., Jabberwacky and D.U.D.E (Agence Nationale de la Recherche and CNRS 2006). While ELIZA and PARRY were used exclusively to simulate typed conversation, many chatbots now include functional features such as games and web searching abilities. In 1984, a book called The Policeman's Beard is Half Constructed was published, allegedly written by the chatbot Racter (though the program as released would not have been capable of doing so).
One pertinent field of AI research is natural language processing. Usually, weak AI fields employ specialized software or programming languages created specifically for the narrow function required. For example, A.L.I.C.E. uses a markup language called AIML, which is specific to its function as a conversational agent and has since been adopted by various other developers of so-called Alicebots. Nevertheless, A.L.I.C.E. is still purely based on pattern-matching techniques without any reasoning capabilities, the same technique ELIZA was using back in 1966. This is not strong AI, which would require sapience and logical reasoning abilities. Jabberwacky learns new responses and context based on real-time user interactions, rather than being driven from a static database. Some more recent chatbots also combine real-time learning with evolutionary algorithms that optimise their ability to communicate based on each conversation held. Still, there is currently no general-purpose conversational artificial intelligence, and some software developers focus on the practical aspect, information retrieval.
Chatbot competitions focus on the Turing test or more specific goals. Two such annual contests are the Loebner Prize and The Chatterbox Challenge (offline since 2015; materials can still be found in web archives). According to Forrester (2015), AI will replace 16 percent of American jobs by the end of the decade. Chatbots have been used in applications such as customer service, sales and product education. However, a study conducted by Narrative Science in 2015 found that 80 percent of their respondents believe AI improves worker performance and creates jobs. Today, most chatbots are accessed either via virtual assistants such as Google Assistant and Amazon Alexa, via messaging apps such as Facebook Messenger or WeChat, or via individual organizations' apps and websites. Chatbots can be classified into usage categories such as conversational commerce (e-commerce via chat), analytics, communication, customer support, design, developer tools, education, entertainment, finance, food, games, health, HR, marketing, news, personal, productivity, shopping, social, sports, travel and utilities.
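The keyword-scanning strategy described above, pulling the reply whose keywords best match the input, can be sketched in a few lines of Python. The reply database below is a made-up toy for illustration, not part of this project.

```python
REPLIES = {
    # toy reply database: keywords -> canned response (hypothetical data)
    ("hello", "hi", "hey"): "Hello! How can I help you?",
    ("price", "cost", "buy"): "Our pricing page lists all plans.",
    ("bye", "goodbye"): "Goodbye, have a nice day!",
}

def respond(message, fallback="Sorry, I don't understand."):
    """Score each canned reply by how many of its keywords appear in the
    input, and return the best match -- the simple keyword-scan strategy
    many basic chatbots use instead of real language understanding."""
    words = set(message.lower().split())
    best_reply, best_score = fallback, 0
    for keywords, reply in REPLIES.items():
        score = len(words.intersection(keywords))
        if score > best_score:
            best_reply, best_score = reply, score
    return best_reply
```

This is exactly the pattern-matching-without-reasoning approach the description attributes to ELIZA and A.L.I.C.E.: no understanding, just lookup.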
bhaveshjaggi / PestDetection
PEST DETECTION USING IMAGE PROCESSING. The principal idea which empowered us to work on this project is to ensure improved and better farming techniques for farmers.
Our Solution: The techniques of image analysis are extensively applied to agricultural science; they provide maximum protection to crops with much less use of pesticides, which can ultimately lead to better crop management and production. The following software is required for the project: OpenCV with C++/Python, a library designed for computational efficiency with a strong focus on real-time applications.
Pest Detection System: the following image-processing steps are used in the proposed system.
> Color Image to Gray Image Conversion: Images are converted to grayscale so that they can be handled easily and require less storage. The following equation shows how images are converted to grayscale: I(x,y) = 0.2989*R + 0.5870*G + 0.1140*B
> Image Filtering: The PSNR value is calculated for both the average- and median-filtered images. The average filter provides a better result than the median filter, so the average filter is used for further processing.
> Image Segmentation: To detect the pests in the images, the image background is estimated using morphological operators and then subtracted from the original image, so the resulting image contains only the objects of interest with pixel value 1 against background pixels with value 0.
> Noise Removal: Noise includes dew drops, dust and other visible parts of leaves. As only the object of interest should be visible in the images, the aim was to remove the noise to get better and more effective results. The erosion algorithm is used to remove isolated noisy pixels and to smooth object boundaries. After noise removal, the detected pests are enhanced using the dilation algorithm.
> Feature Extraction: Different properties of the images are calculated, and the images are classified on the basis of those attributes. For image properties, the gray-level co-occurrence matrix and regional properties of the images are calculated. These properties are used to train a support vector machine to classify the images.
> Pest Counting: Counting the pests on the leaves is the main purpose, as it gives an idea of how many pests are on a leaf. It uses the Moore neighborhood tracing algorithm with Jacob's stopping criterion.
Feasibility: The present framework of pest detection is quite tedious and laborious for farmers, as they have to carry out acre-by-acre surveys themselves, which requires a lot of vigorous effort. Image analysis provides a realistic opportunity for the automation of insect pest detection. Through this system, crop technicians can easily count the pests from the collected specimens, and the right pest management can be applied to increase both the quantity and quality of production. The automated system makes the monitoring process easier; this automation increases profitability and reduces labour.
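The first and fourth steps above (grayscale conversion with the stated weights, and binary erosion for noise removal) can be sketched in plain NumPy. The project itself uses OpenCV, where cv2.cvtColor and cv2.erode do the same work; this is just a minimal illustration of the math.

```python
import numpy as np

def to_gray(img_rgb):
    """Convert an RGB image to grayscale using the luminance weights
    from the description: I = 0.2989*R + 0.5870*G + 0.1140*B."""
    r, g, b = img_rgb[..., 0], img_rgb[..., 1], img_rgb[..., 2]
    return 0.2989 * r + 0.5870 * g + 0.1140 * b

def erode(binary, iterations=1):
    """Binary erosion with a 3x3 square structuring element: a pixel
    stays 1 only if itself and all 8 neighbours are 1. This is what
    removes isolated noisy pixels and smooths object boundaries."""
    out = binary.astype(bool)
    h, w = binary.shape
    for _ in range(iterations):
        padded = np.pad(out, 1, constant_values=False)
        out = np.ones((h, w), dtype=bool)
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                out &= padded[1 + dy : 1 + dy + h, 1 + dx : 1 + dx + w]
    return out.astype(np.uint8)
```

Dilation, used afterwards to enhance the detected pests, is the dual operation: a pixel becomes 1 if any pixel in its 3x3 window is 1.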
topdsoft / Marketplace
CakePHP-based marketplace software for classified ads (like Craigslist).
geodesicsolutions-community / Geocore Community
GeoCore Community, open-source classifieds and auctions software.
AmandaBoatswain / Cv Based Yield Monitor
Code developed for my Master's thesis project. A machine-vision-based yield monitor was designed to perform identification, size categorization and continuous counting of shallot onions in situ during the harvesting process. The system is composed of a video logger and a global positioning system (GPS), coupled with computer software developed in Python. Computer vision analysis is performed within the tractor itself, while an RGB camera positioned directly above the harvesting conveyor collects real-time video of the crops under natural sunlight conditions. Vegetables are segmented using watershed segmentation, detected on the conveyor, and then classified by size.
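The detect-and-classify-by-size step can be sketched as a connected-component pass over a binary segmentation mask (such as the one watershed segmentation produces), measuring each object's pixel area. The size thresholds below are invented for illustration and are not those used in the thesis.

```python
import numpy as np

def label_and_size(mask):
    """Find 4-connected components in a binary mask via flood fill and
    return the pixel area of each detected object."""
    mask = mask.astype(bool)
    seen = np.zeros_like(mask)
    areas = []
    h, w = mask.shape
    for y in range(h):
        for x in range(w):
            if mask[y, x] and not seen[y, x]:
                stack, area = [(y, x)], 0
                seen[y, x] = True
                while stack:
                    cy, cx = stack.pop()
                    area += 1
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if 0 <= ny < h and 0 <= nx < w and mask[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            stack.append((ny, nx))
                areas.append(area)
    return areas

def size_class(area, small=4, large=9):
    # hypothetical pixel-area thresholds for small/medium/large onions
    if area < small:
        return "small"
    return "medium" if area < large else "large"
```

In the real system the per-object areas would be calibrated against the camera geometry before being binned into size categories.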
MahsaSinaei / Malware Detection By System Call Graph Using Machine Learning
Uses a system call dependency graph to detect malware and analyze its behavior. The system calls were extracted and collected by Fredrikson et al. [1]; the dataset contains two benchmark sets: a malware set and a regular-software set. The malware set comprises 2631 samples pre-classified into 48 families and 11 types; the regular-software set comprises 35 samples. A dependency graph is built from these system calls, and a set of features is extracted for each piece of software to characterize its behavior. A feature selection method reduces the number of features by clustering them. Machine learning algorithms such as Decision Tree, Random Forest, K-Nearest Neighbors, Support Vector Machines, and Neural Networks are used to build two prediction models. The first is a two-class model that classifies software into malware and regular software. The second is a multi-class model that, in addition to separating malware from regular software, identifies the type of malware.
[1] Matt Fredrikson, Somesh Jha, Mihai Christodorescu, Reiner Sailer, and Xifeng Yan. Synthesizing near-optimal malware specifications from suspicious behaviors. In Security and Privacy (SP), 2010 IEEE Symposium on, pages 45-60. IEEE, 2010.
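The two-class step can be sketched with a hand-rolled k-nearest-neighbors classifier over toy feature vectors; in the project these features come from the system-call dependency graph, but the vectors and samples below are invented for illustration.

```python
from collections import Counter
import math

def knn_predict(train, query, k=3):
    """Classify a feature vector by majority vote among its k nearest
    training samples (Euclidean distance) -- a minimal stand-in for the
    two-class malware-vs-regular model."""
    dists = sorted(
        (math.dist(features, query), label) for features, label in train
    )
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# toy training set: (feature vector, label); features might be counts of
# selected system-call dependency edges -- numbers invented here
train = [
    ((9.0, 8.0), "malware"),
    ((8.0, 9.0), "malware"),
    ((10.0, 10.0), "malware"),
    ((1.0, 0.0), "regular"),
    ((0.0, 1.0), "regular"),
    ((1.0, 1.0), "regular"),
]
```

The multi-class model works the same way, except labels are malware types (plus "regular") instead of the binary malware/regular split.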
JakeDug / Datascienceproject
Repository for a 4th-year Software Development data science project. We developed a system that allows a user to upload x-ray images, which are then classified as having pneumonia or not having pneumonia.
abhishekjain1991-zz / SDN Simulation
SDN vs Traditional Networks: Traditional networks were designed to forward packets from source to destination using the shortest route possible; routers and switches were mostly agnostic to the applications being served by the network. Software-defined network (SDN) architecture allows service providers to build networks with increased application awareness, which can be built into the network by developing SDN controller applications that keep track of application-level characteristics and use that intelligence to provision flows into the network switches. Application awareness is the result of gained intelligence about Layer 4 to Layer 7 protocol attributes and delivery requirements.
Methodology: When the client transmits an IP packet, the switch inspects the packet and, depending on the policy/rule installed in it, forwards the packet on a particular route. If the switch doesn't have any policy/rule installed, it sends the packet to the controller. The controller inspects the packet header and/or the payload, determines the type of packet (TCP, UDP, HTTP, etc.), and installs a policy/rule on the switch instructing it to forward such packets along a particular route.
Our Implementation: We have created different paths for TCP, UDP and ICMP traffic, and have further classified TCP traffic into HTTP and FTP traffic. Each type of traffic is assigned a dedicated path. Our implementation could easily be extended to be adaptive, so that the features of an application-aware network mentioned above could be realized.
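The table-miss / rule-install loop described in the methodology can be sketched as a small simulation. The traffic classes and path names below are illustrative only, not the project's actual controller code.

```python
# Illustrative per-class paths (made-up names)
PATHS = {"HTTP": "path-A", "FTP": "path-B", "UDP": "path-C", "ICMP": "path-D"}

def classify(packet):
    """Determine the traffic class; TCP is further split by destination
    port into HTTP (port 80) and FTP, as in the implementation above."""
    kind = packet["type"]
    if kind == "TCP":
        kind = "HTTP" if packet.get("dport") == 80 else "FTP"
    return kind

class Controller:
    """On a table miss, inspect the packet and return a flow rule."""
    def make_rule(self, packet):
        kind = classify(packet)
        return kind, PATHS.get(kind, "default-path")

class Switch:
    """Forward using the installed flow table; punt to the controller
    on a miss and install the rule it returns."""
    def __init__(self, controller):
        self.controller = controller
        self.flow_table = {}   # traffic class -> path

    def forward(self, packet):
        kind = classify(packet)
        if kind not in self.flow_table:            # table miss
            match, path = self.controller.make_rule(packet)
            self.flow_table[match] = path          # install the rule
        return self.flow_table[kind]               # fast path thereafter
```

After the first packet of each class, the switch forwards from its own flow table without contacting the controller, which is the behavior an adaptive extension would build on.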