Want to see e6data in action?

Learn how data teams power their workloads.

Get Demo

Lakehouse Views: July 2024

About the event

Join us for an exclusive in-person event on "Real-time Streaming and Data Lakehouse," hosted by e6data in collaboration with The Big Data Show. This meetup is designed for senior software engineers, data engineers, and data architects who are constantly looking to optimise their data architecture to make it more price-performant while delivering the best user experience. In this edition, we will deep-dive into cutting-edge developments in real-time streaming architecture, focusing on Kafka, Redis, data caching mechanisms, and the governance around them. Lakehouse Views is designed to let fellow data nerds meet, network, and have insightful discussions about the entropic world of data.

Meet the speakers

Vivek Bansal, Senior Software Engineer at Uber

Topic: What makes Kafka & Redis so fast?

Insights into Kafka's and Redis's internal architecture, why they are so efficient, and how they became the de facto choices for streaming and caching in the industry (a short illustrative code sketch follows below).

Time: 9:00 - 9:45 AM IST
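
To give a flavour of the internals this session touches on, here is a minimal sketch (illustrative only, not material from the talk) of Redis pipelining, one of the techniques that keeps per-command network overhead low. It assumes a local Redis server and the redis-py client.

```python
# Minimal sketch: Redis pipelining vs. one round trip per command.
# Assumes a Redis server on localhost:6379 and redis-py (pip install redis).
import time

import redis

r = redis.Redis(host="localhost", port=6379)

# Naive approach: one network round trip per command.
start = time.perf_counter()
for i in range(10_000):
    r.set(f"key:{i}", i)
naive = time.perf_counter() - start

# Pipelined approach: commands are buffered and sent in batches,
# so the client pays for far fewer round trips.
start = time.perf_counter()
pipe = r.pipeline()
for i in range(10_000):
    pipe.set(f"key:{i}", i)
pipe.execute()
pipelined = time.perf_counter() - start

print(f"naive: {naive:.2f}s, pipelined: {pipelined:.2f}s")
```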

Sudarsan Lakshmi Narasimhan and Faiz Kothari, Senior Engineering team, e6data

Topic: Data caching in Lakehouse query engines

How to use efficient caching mechanisms to reduce costs while maintaining high performance and data freshness (a short illustrative code sketch follows below).

Time: 9:45 - 10:30 AM IST
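
As a conceptual taste of the topic (this is not e6data's implementation), the sketch below shows a tiny TTL-based cache in front of object-store reads, which captures the basic trade-off between cost savings and data freshness that the session will explore. The paths and TTL are hypothetical.

```python
# Conceptual sketch: a tiny TTL cache for remote object-store reads,
# illustrating the cost-vs-freshness trade-off in lakehouse data caching.
import time
from typing import Callable, Dict, Tuple


class TTLCache:
    def __init__(self, ttl_seconds: float):
        self.ttl = ttl_seconds
        self._store: Dict[str, Tuple[float, bytes]] = {}

    def get(self, key: str, fetch: Callable[[], bytes]) -> bytes:
        now = time.monotonic()
        hit = self._store.get(key)
        if hit is not None and now - hit[0] < self.ttl:
            return hit[1]              # fresh enough: avoid a paid re-read
        data = fetch()                 # stale or missing: go to object storage
        self._store[key] = (now, data)
        return data


def read_from_object_store(path: str) -> bytes:
    # Placeholder for an S3/ADLS GET; each call costs money and latency.
    print(f"fetching {path} from object storage")
    return b"...parquet bytes..."


cache = TTLCache(ttl_seconds=60)
path = "s3://bucket/table/part-0.parquet"  # hypothetical path

# First call fetches; repeated calls within 60 s are served from the cache.
cache.get(path, lambda: read_from_object_store(path))
cache.get(path, lambda: read_from_object_store(path))
```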

Sagar Prajapati, Founder of Geekcoders

Topic: Unity Catalog and Delta Lakehouse for data governance

Best practices for using Unity Catalog with Delta Lake tables for comprehensive governance of your data assets (a short illustrative code sketch follows below).

Time: 10:45 - 11:30 AM IST
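
For context, the sketch below shows the general shape of table-level grants with Unity Catalog SQL issued from PySpark. It assumes a Databricks workspace with Unity Catalog enabled (where `spark` is predefined in a notebook); the catalog, schema, table, and group names are hypothetical.

```python
# Illustrative only: granting table-level access with Unity Catalog SQL.
# Assumes a Databricks notebook where `spark` is already available; all
# object and group names below are hypothetical.
spark.sql("GRANT USE CATALOG ON CATALOG main TO `data-analysts`")
spark.sql("GRANT USE SCHEMA ON SCHEMA main.sales TO `data-analysts`")
spark.sql("GRANT SELECT ON TABLE main.sales.orders TO `data-analysts`")

# Review who can read the table.
spark.sql("SHOW GRANTS ON TABLE main.sales.orders").show(truncate=False)
```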

Vishnu Vasanth, Founder & CEO, e6data

Topic: Demystifying the analytics landscape

Insights into the evolving analytics landscape and emerging use cases centered around data lakehouse architecture, with new players in data catalogs, open table formats, query engines, and more.

Time: 11:45 AM - 12:30 PM IST

This is an exclusive and invite-only event. Please RSVP to reserve your spot below.

Venue:

e6data, Unit No. 401, 4th Floor, Embassy Square, Infantry Road, Vasanth Nagar, Bangalore, India

Date & Time:

27th July 2024 from 8:30 AM to 12:30 PM IST

Register Now

Build future-proof data products

Try e6data for your heavy workloads!

Get Started for Free
1. Pick a heavy workload

Choose a common cross-industry "heavy" workload, or work with our solution architect team to identify your own.

2. Define your 360° interop

Define all points of interop with your stack (e.g. catalog, BI tool). e6data is serverless-first and available on AWS/Azure.

3. Pick a success metric

Supported dimensions: speed/latency, TCO, and latency under load. Pick any linear combination of these three dimensions (see the sketch after these steps).

4. Pick a kick-off date

Assemble your team (data engineer, architect, DevOps) for the kick-off, and go live within 10 business days of the kick-off date.
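
As referenced in step 3, here is a minimal sketch of what a linear combination of the three supported dimensions could look like. The weights and input numbers are purely hypothetical and only illustrate the shape of the calculation.

```python
# Worked sketch of a combined success metric; weights and inputs are hypothetical.
def success_score(speedup: float, tco_reduction: float,
                  latency_under_load_gain: float,
                  weights=(0.5, 0.3, 0.2)) -> float:
    """Linear combination of the three supported dimensions.

    speedup                  -- e.g. 2.0 means queries run 2x faster
    tco_reduction            -- e.g. 0.4 means 40% lower TCO
    latency_under_load_gain  -- e.g. 1.5 means 1.5x better p95 under load
    """
    w_speed, w_tco, w_load = weights
    return (w_speed * speedup
            + w_tco * tco_reduction
            + w_load * latency_under_load_gain)


# Example: a PoC that targets mainly speed, with some weight on TCO.
print(success_score(speedup=3.0, tco_reduction=0.5, latency_under_load_gain=2.0))
```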

Frequently asked questions (FAQs)

How do I integrate e6data with my existing data infrastructure?

We are universally interoperable and open-source friendly. We can integrate with any object store, table format, data catalog, governance tool, BI tool, and other data applications.

How does billing work?

We use a usage-based pricing model tied to vCPU consumption. Your bill is determined by the number of vCPUs used, ensuring you pay only for the compute power you actually consume.
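
For a back-of-the-envelope feel of how vCPU-based billing adds up, here is a small sketch. The rate used is hypothetical and is not an e6data price; it only shows the shape of the calculation.

```python
# Back-of-the-envelope sketch of usage-based billing tied to vCPU consumption.
# The rate below is hypothetical and NOT an e6data quote.
HYPOTHETICAL_RATE_PER_VCPU_HOUR = 0.05  # example figure only


def monthly_cost(vcpus: int, hours_per_day: float, days: int = 30) -> float:
    vcpu_hours = vcpus * hours_per_day * days
    return vcpu_hours * HYPOTHETICAL_RATE_PER_VCPU_HOUR


# e.g. a 64-vCPU workload that runs 6 hours a day:
print(f"${monthly_cost(64, 6):,.2f} per month at the hypothetical rate")
```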

What kind of file formats does e6data support?

We support all common file formats, including Parquet, ORC, JSON, CSV, Avro, and others.
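
As a generic illustration (not e6data-specific), these open formats can be read with standard open-source tooling such as pyarrow; the file paths below are placeholders, and Avro typically needs a separate library such as fastavro.

```python
# Generic illustration of reading the open file formats listed above with pyarrow.
# Paths are placeholders; this is tooling-level context, not e6data's API.
import pyarrow.csv as pa_csv
import pyarrow.json as pa_json
import pyarrow.parquet as pq
from pyarrow import orc

parquet_table = pq.read_table("data/events.parquet")
orc_table = orc.read_table("data/events.orc")
csv_table = pa_csv.read_csv("data/events.csv")
json_table = pa_json.read_json("data/events.json")

print(parquet_table.schema)
```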

What kind of performance improvements can I expect with e6data?

e6data delivers 5 to 10x faster query speeds at any concurrency, with over 50% lower total cost of ownership across workloads compared to other compute engines on the market.

What kinds of deployment models are available at e6data?

We support serverless and in-VPC deployment models. 

How does e6data handle data governance rules?

We can integrate with your existing governance tool, and also have an in-house offering for data governance, access control, and security.