Jamie Engesser is the Senior Vice President of Product Management at Hortonworks. With more than twenty years of professional experience in the software industry, Jamie most recently had global responsibility for the Hortonworks Solutions Engineering organization, which is focused on guiding organizations to identify their Hadoop opportunity, from business case to proof of concept to successful project delivery. Prior to Hortonworks, Jamie led global Solutions Engineering teams at SpringSource and VMware. Jamie has extensive experience spanning open source, Java, Platform as a Service (PaaS), application infrastructure, and Big Data. He holds a Bachelor of Science in Industrial Engineering from Montana State University.
Alan is a founder of Hortonworks and an original member of the engineering team that took Pig from a Yahoo! Labs research project to a successful Apache open source project. Alan is a PMC member of Apache Hive, Apache Pig, and many other Apache projects. As part of the Apache Incubator PMC he has mentored many new Apache communities. Alan has a BS in Mathematics from Oregon State University and an MA in Theology from Fuller Theological Seminary. He is also the author of Programming Pig, a book from O'Reilly. Follow Alan on Twitter: @alanfgates.
Grant Priestley is currently leading the architecture and implementation of a Big Data, machine learning, and AI platform at one of Australia's largest telecommunications organisations. He has more than 20 years of IT and business experience and has spent the last 8 years architecting and delivering some of Australia's largest machine learning and advanced analytics platforms. With long-standing domestic and international experience in Big Data, machine learning, and AI within FSI, government, retail, and telecommunications, he enjoys driving the execution of architectures that use data and technology to help companies realise measurable value from analytics, machine learning, and AI.
Andy LoPresto is a Sr. Member of Technical Staff at Hortonworks working on the Hortonworks DataFlow team. In this role he serves as both a Committer and Project Management Committee Member for Apache NiFi, an open source, robust, secure data routing and delivery system. Andy focuses on security concerns within NiFi, including identity management, TLS negotiation, data protection, access control, encryption, and hashing. Andy is also involved with the NiFi sub-project Apache MiNiFi, which drives edge data collection, including secure command and control and immediate data provenance and governance. He has presented on NiFi at DataWorks Summits in Singapore, Tokyo, Melbourne, Berlin, Sydney, and San Jose, at FOSDEM '17 in Brussels, and at the OpenIoT Summit 2017.
Andrew Psaltis is deeply entrenched in streaming and IoT systems and obsessed with delivering insight at the speed of thought. The author of Streaming Data (http://manning.com/psaltis/) from Manning and an international speaker and trainer, he spends most of his waking hours thinking about, writing about, and building streaming systems. When he's not busy being busy, he's spending time with his lovely wife and two kids and watching as much lacrosse as possible. He has spoken at Berlin Buzzwords (2014, 2015, 2016, 2017), ApacheCon (2015, 2017), QCon New York (2016), IoT StampedCon (2017), Big Data StampedCon (2015, 2016, 2017), and DataWorks Summit Sydney (2018).
Santhosh B Gowda is an Engineering Manager, QE, at Hortonworks. He takes care of the scale, resiliency, and performance aspects of Hortonworks products (i.e. HDP, HDF, and DataPlane). He has more than 13 years of IT experience and has focused on Hadoop for the last 2 years. He holds an M.S. in Software Systems from BITS Pilani.
Abhas, an electrical engineer by training, is a seasoned strategy consultant and a passionate entrepreneur. An ardent innovator, he has the rare experience of incubating a digital start-up and successfully growing it into a lean alternative growth engine generating revenues in excess of $10M.
Currently he heads up Strategy & Innovation at Hortonworks and helps early-stage startups with angel investments, product prototyping, affiliate partnership strategy, marketing and loyalty business-stream creation, and sustainable 360-degree growth, on both sides of the Atlantic.
He was selected as a Global Shaper by the World Economic Forum, named one of the '100 Visionaries under 30' alongside the likes of Nobel Laureate Malala Yousafzai by Real Leaders Magazine, and recognised as a Founder of the Future (under 35) by the Founders Forum.
Simon is a data scientist with experience in product management who has worked for numerous data technology companies, from vendors like Hortonworks to data users in retail, hedge funds, and the web. His focus is on big data, machine learning, and using these technologies to deliver results.
Robert Hryniewicz has over 10 years of experience working on projects related to Artificial Intelligence, Enterprise Software, IoT, Robotics, Blockchain, and more. Currently, he's a Data Scientist and Evangelist at Hortonworks. Previously, Robert was a CTO at a Singularity Labs startup and a Sr. Architect at Cisco, NASA, and others. He's a frequent speaker at DataWorks / Hadoop Summits.
Sanjay is a founder and chief architect at Hortonworks, and an Apache Hadoop committer and member of the Apache Hadoop PMC. Prior to co-founding Hortonworks, Sanjay was the chief architect of core Hadoop at Yahoo! and part of the team that created Hadoop. Within Hadoop he has contributed to several areas, including HDFS, MapReduce schedulers, YARN's design, high availability, and compatibility. He has also held senior engineering positions at Sun Microsystems and INRIA, where he developed software for distributed systems and grid/utility computing infrastructures. Sanjay has a PhD in Computer Science from the University of Waterloo in Canada.
Steven O'Neill began his career as a DBA and has 20 years' experience specialising in Oracle and SQL Server. He started working with Hadoop, using Hortonworks HDP, almost four years ago. As technical lead for Hadoop within the Department of Home Affairs he has overseen the implementation of multiple projects involving data archiving, image storage and text indexing. Steven and his team are currently focusing on analytic workloads, graph storage and analysis, and improvements to streaming ingestion of data into both Hadoop and the Department's Teradata Enterprise Data Warehouse.
Dwane Hall began his career as a Java Developer before moving into a Data Engineering role, all within the Department of Home Affairs. He has been working with Hadoop and Hortonworks HDP for the past two years. As a Hadoop Developer, his focus has been on architecting and securing the Department's NiFi and Solr environments, and on identifying and implementing solutions that utilise the Department's data holdings to support operational decision making.