The Dove IO Data Platform allows customers to process multimedia data, providing real-time identification of objects, people, text, scenes, and activities in audio, image, and video data. The platform leverages NiFi to empower customers to build workflows around custom data inputs and transformations.
Builds all data ingestion pipelines using NiFi. Has deployed NiFi clusters to ingest, transform, and deliver data to analytics backends, handling data mediation for both real-time and batch jobs.
GoDataDriven, a Dutch service company in the data science and engineering space, helps customers ingest and process data in real time from the most disparate devices (including but not limited to trains!).
Hashmap uses Apache NiFi to securely collect, transmit, and transform data for ingestion and delivery into our IoT / Time Series Accelerator platform, allowing for outcome-based, real-time analytics and visualization of oil & gas, utilities, manufacturing, industrial, retail, pharma, and process control data. Additionally, we are creating a catalog of open source, ready-to-run, industry-specific NiFi processors and controller services for protocols like OPC-UA, ETP, WITSML, LAS, and many others.
Hastings Group is a fast-growing, agile, digitally focused general insurance provider serving the UK car, van, bike, and home insurance market. We have strong relationships with all major price comparison websites (PCWs) and utilise Apache NiFi to process and ingest millions of items of data.
Kuehne+Nagel is a global transport and logistics company founded in 1890. We use NiFi as a processing and orchestration tool for the company's core workflow: producing and dispatching documents.
We're building all new data ingestion pipelines using NiFi, and existing pipelines are being migrated to NiFi as well. We have deployed NiFi clusters to ingest, transform, and deliver data to various backends like Google BigQuery, Amazon Redshift, and Amazon S3.
Uses Apache NiFi to securely and reliably transfer, transform, enrich, and deliver billions of individual events per day (e.g., security logs, system metrics, aggregated data sets) across multiple data centers.
Micron's Enterprise Analytics and Data team uses NiFi to acquire worldwide manufacturing data and ingest it into symmetrical Global Data Warehouses. This data is vaulted and exposed via data marts, providing our data scientists and business partners a secured global view of our manufacturing processes. For ingestion flows requiring higher-performance transformations, we utilize the NiFi Site-to-Site protocol to seamlessly hand processing off to Spark jobs on our Hadoop clusters. We have leveraged NiFi's REST API to automate the creation and monitoring of new ingestion pipelines, which furthers our goal of providing a complete Global Data Warehouse.
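Automating pipelines through the NiFi REST API, as described above, typically means calling endpoints such as `PUT /nifi-api/flow/process-groups/{id}` to schedule a flow and `GET /nifi-api/flow/process-groups/{id}/status` to monitor it. A minimal sketch of the client-side helpers (the base URL is a placeholder, and the exact response fields should be checked against your NiFi version's API docs):

```python
from urllib.parse import urljoin

# Placeholder base URL; point this at your own NiFi instance.
NIFI_API = "https://nifi.example.com/nifi-api/"

def process_group_status_url(pg_id: str) -> str:
    """URL of the flow-status endpoint for a process group."""
    return urljoin(NIFI_API, f"flow/process-groups/{pg_id}/status")

def schedule_payload(pg_id: str, running: bool) -> dict:
    """Request body for PUT /nifi-api/flow/process-groups/{id},
    which starts or stops every component in the group."""
    return {"id": pg_id, "state": "RUNNING" if running else "STOPPED"}

def queued_flowfiles(status_json: dict) -> int:
    """Pull the queued-FlowFile count out of a process-group
    status response, a simple health signal for monitoring."""
    agg = status_json["processGroupStatus"]["aggregateSnapshot"]
    return int(agg["flowFilesQueued"])
```

With an HTTP client such as `requests`, starting a group would then be `requests.put(urljoin(NIFI_API, f"flow/process-groups/{pg_id}"), json=schedule_payload(pg_id, True))`, plus whatever authentication your cluster requires.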
Ona is a software engineering and design firm based in Nairobi, Kenya and Washington, DC. Our mission is to improve the effectiveness of humanitarian and development aid by empowering organizations worldwide with collective and actionable intelligence. We use Apache NiFi to ingest, process, and disseminate global health and service delivery data from diverse sources.
Designs large-scale NiFi clusters for high-volume ingest/egress and provides day-to-day operational support and maintenance.
NiFi primarily serves as our consumer between Kafka and HDFS. NiFi also provides schema validation for event streams while enabling us to modify and republish secure event streams for general use. NiFi extracts and standardizes large datasets from third parties across various sources, including HDFS, S3, Kafka, and SFTP.
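The schema validation described above routes each event record as valid or invalid before it is republished. A minimal sketch of that per-record check, with an illustrative schema (the field names are assumptions, not the actual event schema; in NiFi itself this role is typically played by a record reader with an Avro/JSON schema):

```python
# Illustrative schema: required field name -> expected Python type.
EVENT_SCHEMA = {
    "timestamp": int,
    "source": str,
    "message": str,
}

def validate_event(record: dict, schema: dict = EVENT_SCHEMA):
    """Return (is_valid, errors) for one event record.

    A record is valid when every schema field is present with
    the expected type; extra fields are allowed to pass through.
    """
    errors = []
    for field, ftype in schema.items():
        if field not in record:
            errors.append(f"missing field: {field}")
        elif not isinstance(record[field], ftype):
            errors.append(f"wrong type for {field}: "
                          f"expected {ftype.__name__}")
    return (not errors, errors)
```

Valid records would continue toward HDFS, while invalid ones are routed to a failure relationship for inspection, mirroring NiFi's valid/invalid routing pattern.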
Uses Apache NiFi to power active monitoring. As various network devices are monitored, SNMP is used as the unifying protocol for communication. Apache NiFi runs in active query mode, periodically polling these devices. Transformation of the SNMP responses and their delivery to HDFS and Elasticsearch are also built with Apache NiFi.
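The transform step in such a flow flattens each polled SNMP response into a document suitable for HDFS or Elasticsearch. A minimal sketch, assuming the response arrives as a mapping of OID strings to values (the two OIDs shown are the standard MIB-2 `sysUpTime.0` and `sysName.0`, but the field mapping is illustrative):

```python
import json
from datetime import datetime, timezone

def snmp_response_to_doc(host: str, varbinds: dict) -> str:
    """Flatten one SNMP GET response into a JSON document.

    `varbinds` maps OID strings to the values returned by the
    polled device; missing OIDs simply become null fields.
    """
    doc = {
        "host": host,
        "polled_at": datetime.now(timezone.utc).isoformat(),
        "uptime_ticks": varbinds.get("1.3.6.1.2.1.1.3.0"),  # sysUpTime.0
        "sys_name": varbinds.get("1.3.6.1.2.1.1.5.0"),      # sysName.0
    }
    return json.dumps(doc)
```

In a NiFi flow the same shaping could be done with a scripted or record-oriented processor between the SNMP query and the HDFS/Elasticsearch put processors.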
Think Big's open-source data lake management platform Kylo offers a turn-key, enterprise-ready data lake solution that integrates best practices around metadata management, governance, and security gleaned from Think Big's 150+ big data implementation projects. Kylo uses Apache NiFi as the underlying scheduler and orchestration engine, along with other technologies like Apache Hadoop and Apache Spark.