Jason’s role in Intel’s Data Center Group (DCG) focuses on presenting current industry best practices and trends in cloud computing to end users and Intel partners. This includes best practices for developing overall cloud architectures and cloud management strategies, as well as designing data center modernization blueprints for software-defined infrastructure projects, including software-defined compute, storage, and network phases.
Jason has over thirty years of enterprise data center infrastructure industry experience, including positions with companies such as Hitachi Data Systems, Brocade Communications Systems, VERITAS Software, and Novell. He has published white papers and articles, and has developed and delivered presentations to audiences worldwide on subjects such as cloud computing, cloud infrastructure orchestration, and enterprise compute, storage, and network infrastructure architecture and design.
Josh is a software engineer at Cloudera, focusing on Apache HBase, Phoenix, Accumulo, and ZooKeeper. In a previous role, he was the engineering manager for the HBase team at Hortonworks.
I make data scale
I am a software engineer on the Azure HDInsight team, working on the provisioning and operation of open source software on Azure. Specifically, I focus on optimizing HBase and Phoenix on Azure as a managed service. In general, my interests lie in algorithms for optimization problems, performance and scale, and big data computation. Prior to joining Microsoft, I completed my PhD in Computer Science at the University of Iowa.
I am currently an Engineering Manager at Uber, where I am a member of the Hadoop Platform team working on large-scale data ingestion and dispersal pipelines and libraries leveraging Apache Spark. Previously, I was the tech lead on the metrics team at Uber Maps, building data pipelines to produce metrics that help analyze the quality of our mapping data. Before joining Uber, I worked at Twitter as an original member of the Core Storage team building Manhattan, a key/value store powering Twitter's use cases. I love learning about storage, data platforms, and distributed systems at scale.
Nishith Agarwal is a senior software engineer at Uber, where he works on the Hudi project and the Hadoop platform at large. His interests lie in large-scale distributed and data systems.
Over the past 15 years, Rich Fecher has been solving the hard technical challenges that face the U.S. Defense and Intelligence Communities. Rich has extensive expertise in architecting and building end-to-end systems. His experience ranges from visualization to distributed computing, and he has primarily focused his career toward enriching geospatial content and delivery. Rich has embraced open source technologies as an effective means to provide cutting edge solutions.
Rich has been leading GeoWave development since the project's inception in 2013. He has authored multiple academic papers on the topic, including a contribution to Advances in Spatial and Temporal Databases 2017 (https://link.springer.com/chapter/10.1007/978-3-319-64367-0_6) and a contribution to the Free and Open Source Software for Geospatial (FOSS4G) Conference Proceedings in 2017 (https://scholarworks.umass.edu/cgi/viewcontent.cgi?referer=&httpsredir=1&article=1027&context=foss4g).
Rich holds an M.S. in Computer Science from George Mason University, a post-graduate certificate in GIS from Pennsylvania State University, and a B.S. in Computer Science with minors in Applied Math and Physics from the University of Virginia.
Software engineer, focused on the application of open source technology to unlock geospatial insight.
Tim Spann was a Senior Solutions Architect at AirisData working with Apache Spark and machine learning. Previously, he was a Senior Software Engineer at SecurityScorecard (http://securityscorecard.com/), helping to build a reactive platform for monitoring real-time third-party vendor security risk in Java and Scala. Before that, he was a Senior Field Engineer for Pivotal focusing on Cloud Foundry, HAWQ, and Big Data. He is an avid blogger and the Big Data Zone Leader for DZone (https://dzone.com/users/297029/bunkertor.html).
He runs the very successful Future of Data Princeton meetup, with over 1192 members, at http://www.meetup.com/futureofdata-princeton/.
He is currently a Senior Solutions Engineer at Cloudera in the Princeton, New Jersey area.
You can find all the source code and material behind his talks on his GitHub and Community blog.
Henry Sowell is a Technical Director at Cloudera in the Public Sector.
In this capacity, Mr. Sowell leads an engineering group responsible for the technical architecture and engineering of Big Data solutions supporting missions across the Intelligence Community, Department of Defense, Federal Civilian Government Agencies, and State, Local, and Higher Education institutions, helping improve speed to mission.
Prior to joining Cloudera, Mr. Sowell used several technologies, including Apache Hadoop, to protect the nation in support of the FBI’s counterterrorism mission. In addition to supporting the counterterrorism mission, he leveraged these technologies to support cross-division law enforcement advancements with the FBI’s Cyber Division. Mr. Sowell enlisted in the United States Marine Corps in 2003. He served with distinction as a decorated combat veteran, having earned the Bronze Star with Valor for his actions in Iraq.
Wellington Chevreuil has been working in the technology industry for the last 15 years, spanning systems/application development projects, operations support, and product development, and has been involved with big data/NoSQL since 2011. He currently works as a Software Engineer on Cloudera's HBase development team, helping improve the product through various forms of contribution, such as bug fixes, debuggability/supportability enhancements, documentation, and code reviews.
Krishna manages the strategy and direction for Cloudera’s Operational Databases. The main products in his portfolio are Kudu and HBase. Prior to his current position at Cloudera, he was responsible for portfolio strategy and portfolio product management at Dell. He holds an MBA from Harvard Business School and a Master's and B.S. in Computer Engineering and Computer Science from Cornell University.
I am a senior engineer at Bloomberg, where I have worked on many Bloomberg-specific technologies ranging from networking protocols to SQL and NoSQL databases. Prior to Bloomberg, I worked in telecom, and overall I have 20+ years of development experience spanning firmware, networking protocols, real-time operating systems, and distributed applications. My current work is migrating equity data and applications from Bloomberg-specific technologies to HBase and other related open source technologies.
Amit Anand is a senior software developer at Bloomberg on the Hadoop Services/Infrastructure team, where he is involved in designing and developing user-facing tools around the Hadoop platform. Amit is also involved with the Hadoop Infrastructure team, where he is responsible for the deployment and management of Hadoop clusters. He focuses on HDFS, YARN, HBase, and Spark. He holds a Bachelor's in Commerce and a Master's in Computer Science.
Alex is a founding member and an engineering leader of the data infrastructure team at Tesla, tasked with building scalable, self-service, cost-efficient data platforms that cater to everything data at Tesla. He is heads-down working on enabling Tesla's incredible products that change the world for the better.
Alex is hiring motivated data engineers and SREs to join his team :)
Ankit Singhal has been a committer and a member of the Apache Phoenix PMC (Project Management Committee) for more than 2 years. He has also been contributing to projects like HBase, Tephra, Calcite, Ranger, and Hive. He specializes in designing and developing big data solutions for different lines of business. With over 8 years of big data experience, he has architected and created various analytics products and data warehouse solutions using technologies like Hadoop, Kafka, Hive, HBase, Phoenix, and Spark.
Thomas is a Principal Engineer working on the Big Data team at Salesforce. He is a PMC member of Apache Phoenix. He has worked on different areas in Phoenix related to functional indexing, storage formats, and scaling the metadata layer.
Apache Phoenix committer, member of the Phoenix PMC, software engineer in the Salesforce Big Data group.
Daisuke has been supporting Cloudera's HBase customers for over 7 years and has experienced a wide variety of troubleshooting scenarios.
Toshihiro Suzuki is an HBase committer and a member of the Support team at Cloudera. As a Sr. Software Engineer (Breakfix), he is responsible for troubleshooting complex support cases and fixing bugs in our products (such as HBase/Phoenix). Prior to joining Cloudera, he was an administrator in charge of operating Hadoop/HBase at CyberAgent, Inc. for five years. He authored a Japanese-language book on HBase for beginners.
Ohad Shacham is a Senior Research Scientist at Yahoo Research. He works on scalable big data and search platforms. Most recently, he focused on extending the Omid transaction processing system with high availability and integrating Omid with Apache Phoenix. Ohad received his PhD in concurrent software verification from Tel Aviv University CS in 2012. Prior to Yahoo, Ohad led the SAT-based formal verification activities at IBM Research and worked on automatic software vectorization at Intel.
Karan Mehta is a Software Engineer at Salesforce and a PMC member of the Apache Phoenix project. Karan led the effort to improve Phoenix Query Server, which has opened the gates for several new use cases. Prior to Salesforce, Karan received his Master's at the University of California, Irvine.
Swaroopa Kadam is a budding software engineer and a committer on the Apache Phoenix project. After completing a bachelor's degree, she started her software engineering journey at Oracle India Pvt. Ltd. Her penchant for learning more about infrastructure and database systems encouraged her to pursue a Master of Science in Computer Science specializing in databases and distributed systems. Currently, she is working at Salesforce.com on its Big Data and HBase team.
2018.5 ~ now: Salesforce (Big Data Group, HBase & Phoenix)
2006.6 ~ 2018.5: Microsoft (public cloud, computation, NoSQL store)
Back-end software engineer with 13 years of hands-on experience in development, design, and testing.
1. Working on Apache Phoenix, which enables OLTP and operational analytics in Hadoop (https://phoenix.apache.org/). On the path to becoming a committer.
2. Working on the full stack of a data lake, mainly focusing on job scheduling, distributed query optimization, NoSQL storage, and resource management.
Currently working on Apache Phoenix.
Dr. James Hughes is a mathematician at Commonwealth Computer Research, Inc. in Charlottesville, Virginia. He is a core committer for GeoMesa, which leverages Accumulo, HBase, and other distributed database systems to provide distributed computation and query engines. He is a LocationTech committer for GeoMesa, SFCurve, and JTS, and serves on the LocationTech Project Management Committee and Steering Committee. Through work with LocationTech and OSGeo projects like GeoTools and GeoServer, he builds end-to-end solutions for big spatio-temporal problems. He holds a PhD in algebraic topology from the University of Virginia.
John Highcock is a Solutions Architect at Cloudera (formerly Hortonworks). Prior to joining, he worked on big data projects at the US Department of Justice.
Clay Baenziger is an architect on the Hadoop Infrastructure Team at Bloomberg. Clay comes from a diverse background in systems infrastructure and analytics, ranging from operating systems engineering to financial portfolio analytics. He has been involved in the Hadoop ecosystem for nine years and gives numerous talks each year on Bloomberg's community contributions.