Deep Web Crawling
Simulating human browsing behaviour, websites and forums are scanned to extract content matching specific search parameters.
Link structures are followed in depth, including automatic login, bypassing of IP and country locks, solving of CAPTCHAs, and handling of link redirects and camouflage mechanisms for hidden content.
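As an illustrative sketch, a depth-limited crawl with automatic login could be structured as follows in Python (all URLs, form fields and credentials are hypothetical placeholders; a production crawler additionally handles CAPTCHAs, redirects and geo-blocking):

    # Minimal sketch of depth-limited crawling with automatic login,
    # using the third-party "requests" library. All URLs and form
    # field names below are hypothetical placeholders.
    import re
    import requests

    LOGIN_URL = "https://forum.example.com/login"   # hypothetical
    START_URL = "https://forum.example.com/index"   # hypothetical
    MAX_DEPTH = 2

    session = requests.Session()
    # Present a regular browser identity so the server serves normal content.
    session.headers["User-Agent"] = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"
    session.post(LOGIN_URL, data={"user": "name", "pass": "secret"})  # placeholder credentials

    seen = set()

    def crawl(url, depth):
        """Fetch a page, record it, and follow its links up to MAX_DEPTH."""
        if depth > MAX_DEPTH or url in seen:
            return
        seen.add(url)
        html = session.get(url, timeout=10).text
        # Naive link extraction; a real crawler would use a proper HTML parser.
        for link in re.findall(r'href="(https?://[^"]+)"', html):
            crawl(link, depth + 1)

    crawl(START_URL, 0)
    print(f"Visited {len(seen)} pages")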
Internet Forensics
Based on technical investigations, we answer questions about who is responsible for online content, who provides the information technically, and where the content is stored.
Specific requirements may be met by additional social engineering.
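A minimal sketch of a typical first step, resolving where content is technically hosted, using only Python's standard library (the domain is a placeholder):

    # Forward resolution tells us which IP address serves the content;
    # the reverse lookup often reveals the hosting provider's name.
    import socket

    domain = "example.com"  # placeholder domain under investigation
    ip = socket.gethostbyname(domain)            # forward resolution
    host, aliases, _ = socket.gethostbyaddr(ip)  # reverse DNS lookup
    print(f"{domain} resolves to {ip}, reverse DNS: {host}")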
Big Data Analysis
Unstructured data from heterogeneous sources is normalized and mapped into pre-defined structures for categorization, evaluation and approval.
A combination of centrally defined rules, decentralized editing clients and expert-system learning routines optimizes human-machine interaction.
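A minimal sketch of such rule-based normalization, with illustrative source and field names, might look like this:

    # Heterogeneous records are mapped into one pre-defined structure.
    # Source names, field names and rules are illustrative assumptions.
    from dataclasses import dataclass

    @dataclass
    class Record:
        """The pre-defined target structure for categorization and evaluation."""
        source: str
        author: str
        text: str

    # Centrally defined mapping rules: source field name -> target field name.
    RULES = {
        "forum_a": {"user": "author", "body": "text"},
        "blog_b":  {"writer": "author", "content": "text"},
    }

    def normalize(source, raw):
        """Apply the rule set for 'source' to a raw dict and emit a Record."""
        rule = RULES[source]
        fields = {target: raw[key] for key, target in rule.items()}
        return Record(source=source, **fields)

    print(normalize("forum_a", {"user": "alice", "body": "hello"}))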
Automated Online Research
Search engines are automatically fed with optimized query variations; the provider's ranking of results, as well as restrictions on the number of results, are circumvented.
Recurring research tasks are performed by parallel batch processing, and different perspectives depending on the querying country can be made visible.
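As a sketch, query variations can be generated combinatorially and processed in parallel; the search endpoint and terms below are hypothetical placeholders:

    # Every subject/modifier combination becomes one query variation,
    # and the batch is processed by a small thread pool.
    from concurrent.futures import ThreadPoolExecutor
    from itertools import product
    from urllib.parse import quote_plus

    SUBJECTS  = ["data breach", "data leak"]
    MODIFIERS = ["2024", "report", "forum"]

    queries = [f"{s} {m}" for s, m in product(SUBJECTS, MODIFIERS)]

    def run_query(query):
        # Placeholder: a real client would fetch and parse the result page,
        # possibly via per-country proxies to compare regional rankings.
        return f"https://search.example.com/?q={quote_plus(query)}"

    with ThreadPoolExecutor(max_workers=4) as pool:
        for url in pool.map(run_query, queries):
            print(url)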
Automated IT Workflows
Our specialized scripting environment, featuring pre-defined input and output formats along with a dedicated framework of .NET classes, enables rapid development of multithreaded command-line tools which can be deployed and scheduled on the fly for parallel computing in the cloud.
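A minimal sketch of such a tool, shown here in Python as a stand-in for the .NET environment described above (the worker function is a placeholder):

    # A multithreaded command-line tool with fixed input/output formats:
    # one task per input line on stdin, one result line on stdout.
    import argparse
    import sys
    from concurrent.futures import ThreadPoolExecutor

    def process(line):
        """Placeholder worker; a real tool would perform the actual task here."""
        return line.strip().upper()

    def main():
        parser = argparse.ArgumentParser(description="Parallel line processor")
        parser.add_argument("--threads", type=int, default=8)
        args = parser.parse_args()

        with ThreadPoolExecutor(max_workers=args.threads) as pool:
            # Results are emitted in input order, one line per task.
            for result in pool.map(process, sys.stdin):
                print(result)

    if __name__ == "__main__":
        main()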