Hey There! How Can We Help?
Frequently Asked Questions
What does HAIVO stand for?
HAIVO is not an acronym; it is a word for the connection in a farm tractor that allows the work to move smoothly.
And that is what HAIVO does: humans teach machines to identify images and label them accordingly, so that the machines can carry out human work smoothly.
When was HAIVO established?
HAIVO is a service by B.O.T (Bridge.Outsource.Transform) that was first conceived back in 2020.
However, due to changes in the country and the worldwide pandemic, it came to life in 2022.
What types of data do you support?
Imagery (including satellite imagery), video, text, and audio.
Common input formats: JPEG, PNG, GeoTIFF, JSON, YOLO, CSV, ZIP, RAR
Common output formats: JSON, COCO, YOLO, Pascal VOC, CSV
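As an illustration of how two of these output formats describe the same annotation: COCO stores a bounding box in absolute pixels as [x_min, y_min, width, height], while YOLO stores it as [x_center, y_center, width, height] normalized by the image size. A minimal conversion sketch (the image size and box values below are invented for illustration):

```python
def coco_to_yolo(bbox, img_w, img_h):
    """Convert a COCO bbox [x_min, y_min, width, height] in pixels
    to a YOLO bbox [x_center, y_center, width, height] normalized to 0-1."""
    x, y, w, h = bbox
    return [(x + w / 2) / img_w, (y + h / 2) / img_h, w / img_w, h / img_h]

# A 100x50 pixel box at (200, 150) in a 640x480 image:
print(coco_to_yolo([200, 150, 100, 50], 640, 480))
```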
What types of data transfer do you offer?
Provided you use our platform of choice (Taqadam.io), we offer two types of data transfer:

The data is either uploaded directly to Taqadam's cloud (GCP) and retrieved by individual users, who can only access one image at a time through a phone application (i.e., no user can scrape or access the full dataset);

OR

The data is provided in the form of URL links with a token as a proxy (each link expires after a period of time), and the data is not transferred to Taqadam's GCP at any point in time.
This is a highly secure option for cases where a client does not want the data to leave a specific territory.
What type of security measures do you use to prevent loss of confidential data?
All the project and client data are strictly confidential, and only accessible by authorized personnel with relevant permissions (B.O.T / HAIVO project manager and IT personnel). 

All the B.O.T employees and annotators are required to sign an NDA.
 
All the annotators go through mandatory training focused on data privacy and security.
 
For projects with strict confidentiality requirements, we use tools that do not require data storage on the B.O.T cloud or workers' devices. The data is accessed by the annotators from a secured proxy server connected to a client's server.
Can you describe the project training process for annotators, QAs, and admins?
1 - Pre-project AI training
Prior to being accepted to work on any project with B.O.T, all workers (annotators) are required to complete a purpose-built, proprietary four-week training, "Data Annotation for AI", developed by DOT Lebanon in partnership with Microsoft and sponsored by UNICEF.
 
The training is delivered by AI industry experts and university professors. Those who pass the exam after the training get registered in the pool of available workers that can be called to work on a project.
 
Each worker is assigned a score by a trainer, based on multiple factors essential to deliver data annotation work. This score is updated after every project the worker participates in and is used by Project Managers to create efficient teams.
 
2 - Project specific training
 
For every project, the workers are required to attend project-specific training delivered by an experienced Team Leader (TL) or Project Manager (PM). The training is based on a real project dataset, covers the major concepts and tools, and provides real-time feedback.
 
When a new tool is used, or new features of an existing tool are used for a project, tool-specific training is delivered.
 
After the training, the annotators submit their sample work to TLs for direct feedback. TLs select the best-performing annotators to work on a project while keeping others on "standby".
 
The most efficient and dedicated annotators are promoted to Team Leaders, whose role is to monitor their team members' performance and provide real-time feedback. TLs are responsible for ensuring that targets and quality requirements are met within their teams.
 
Annotators work in teams of 7-10 people led by a Team Leader, who reports to a PM.
PMs perform spot checks to ensure quality consistency across teams and timely delivery.
What is a typical lead time?
We aim to process the incoming requests as soon as possible:
Maximum of 24 hours to respond to an initial request and schedule a call;
Maximum of 24 hours to provide a preliminary quotation and time estimate (provided the data sample is received).
How do you estimate the project duration?
Based on a sample data test, we estimate how many images one person can process per hour; depending on the client's deadlines, we then estimate how many workers are required to complete the project within the requested timeline.
 
We call for training 1.5x the required number of workers, plus a Team Leader/QA/Validator for every 10 workers. During the first two days of the project, we run the numbers to confirm the initial estimates were correct, and add more people when necessary.
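The arithmetic behind this estimate can be sketched as follows; the throughput and volume figures in the example are invented for illustration and are not actual HAIVO rates:

```python
import math


def estimate_team(total_images, images_per_hour, hours_per_day, days_available):
    """Illustrative workforce estimate from a sample-data throughput test."""
    per_worker = images_per_hour * hours_per_day * days_available
    workers_needed = math.ceil(total_images / per_worker)
    trained = math.ceil(workers_needed * 1.5)   # call 1.5x the requirement to training
    leads = math.ceil(workers_needed / 10)      # one TL/QA/Validator per 10 workers
    return workers_needed, trained, leads


# e.g. 50,000 images, 40 images/hour per annotator, 6 hours/day, 10 working days:
print(estimate_team(50_000, 40, 6, 10))  # -> (21, 32, 3)
```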
 
If the project scope is extremely large and requires more workers than are currently available, we reach out to inactive workers who have been previously trained and are on "standby".
 
Even if the client does not have strict time requirements, we aim to deliver the project as fast as possible, taking into consideration the number of available workers, scope, and complexity of the project.
How do you make sure the project is completed within the planned timeline?
In addition to the workforce estimate described above and the general and project-specific training, we apply active team management and risk management policies.
 
A Project Manager sets daily and weekly targets for annotators and the teams (8-10 annotators led by a Team Leader), and runs daily statistics to make sure the targets are met.
 
Team Leaders (TLs) are responsible for ensuring that the team targets are met and apply active team management: TLs report if any team member is underperforming, in which case that member is replaced or an additional member is added.
 
TLs are also responsible for ensuring that the annotated data meets quality requirements: they perform 100% data quality checks, and in addition the Project Manager does spot checks to ensure quality consistency across the teams.
 
On medium to large projects, we submit the first batch of data to the client to make sure all requirements are met and make adjustments early on, so that no errors are replicated across the entire dataset.
 
Progress is communicated to the client on a weekly basis (or more often if required), and any issues are addressed in real time.
Why should I work with HAIVO?
Fast, high-quality annotation: multi-layered QA ensured by an effective team structure, and a smooth two-way communication flow with a dedicated Project Manager.

Flexible cost structure due to the scalable freelance model with minimal fixed costs. 

Social Impact - we contribute to the SDGs, by increasing youth and women's employment* and the development of the digital economy. 

Task-driven, optimized ROI - our team of expert technical consultants designs the project scope so that customers are charged on a "delivered-task" basis (rather than per hour or per bulk of data), which optimizes delivery time and cost structure.

*The services are executed by trained freelancers from remote or low-income communities in Lebanon and Jordan.