Hadoop Distributed File System (HDFS) based architectures allow faster ingestion and processing of larger volumes of time series data than current seismic, hydroacoustic, and infrasonic (SHI) analysis platforms support. We have developed a data acquisition and signal analysis system using Hadoop, Accumulo, and NiFi. The data model stores individual waveform samples and their associated metadata in Accumulo, a significant departure from traditional storage practices, in which continuous waveform segments are stored with their associated metadata as a single entity. Our design allows rapid table scans of large data archives within Accumulo for locating, retrieving, and analyzing specific waveform segments directly. The scalability of Hadoop permits the system to accommodate the ingestion and analysis of new data as a sensor network grows. Our system is currently acquiring data from over 200 SHI sensors. Peak ingest rates approach 500k entries per second while preserving constant sub-second access times to any range of entries, and the data ingest process consumes, on average, less than 10 percent of available system resources.
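The per-sample storage model can be illustrated with a minimal sketch of a lexicographically sortable row-key layout. The station and channel names, timestamp encoding, and sample rate below are assumptions for illustration; the abstract does not specify the actual key design used in the system.

```python
# Hypothetical sketch of a per-sample row-key layout for a sorted key-value
# store such as Accumulo. The field order and encoding are assumptions, not
# the system's actual schema.

def make_row_key(station: str, channel: str, epoch_micros: int) -> str:
    """Build a sortable row key: station, channel, then a zero-padded
    timestamp so byte order matches time order within a channel."""
    return f"{station}:{channel}:{epoch_micros:020d}"

def scan_range(keys, station, channel, start_micros, end_micros):
    """Emulate a range scan over sorted keys: select entries in
    [start, end). In Accumulo this would be a Scanner over a Range."""
    start_key = make_row_key(station, channel, start_micros)
    end_key = make_row_key(station, channel, end_micros)
    return [k for k in sorted(keys) if start_key <= k < end_key]

# One entry per 25 Hz sample (40,000 microseconds apart) for a
# hypothetical sensor.
keys = [make_row_key("ANMO", "BHZ", 1_600_000_000_000_000 + i * 40_000)
        for i in range(100)]

# Retrieve exactly one second of data (25 samples) without reading the rest.
window = scan_range(keys, "ANMO", "BHZ",
                    1_600_000_000_000_000, 1_600_000_001_000_000)
print(len(window))  # → 25
```

Because time order and byte order coincide under this kind of layout, any waveform segment maps to a contiguous key range, which is what makes direct range scans over a large archive efficient.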