lakeFS is an open source layer that delivers resilience and manageability to object-storage based data lakes.
With lakeFS you can build repeatable, atomic and versioned data lake operations - from complex ETL jobs to data science and analytics.
lakeFS supports AWS S3 and Google Cloud Storage as its underlying storage service. It is API compatible with S3 and works seamlessly with modern data frameworks such as Spark, Hive, AWS Athena, and Presto.
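Because lakeFS is API compatible with S3, existing tools can use it by pointing their S3 client at the lakeFS endpoint and addressing data as `<repository>/<branch>/<path>`. A minimal PySpark sketch, assuming a local lakeFS instance at http://localhost:8000, a repository named `example-repo`, and a branch named `main` (all placeholders):

```python
from pyspark.sql import SparkSession

# Point Spark's S3A filesystem at the lakeFS S3 gateway instead of AWS S3.
# Endpoint, credentials, repository and branch names below are placeholders.
spark = (
    SparkSession.builder
    .config("spark.hadoop.fs.s3a.endpoint", "http://localhost:8000")
    .config("spark.hadoop.fs.s3a.connection.ssl.enabled", "false")
    .config("spark.hadoop.fs.s3a.path.style.access", "true")
    .config("spark.hadoop.fs.s3a.access.key", "<lakeFS access key>")
    .config("spark.hadoop.fs.s3a.secret.key", "<lakeFS secret key>")
    .getOrCreate()
)

# Paths take the form s3a://<repository>/<branch>/<path>, so the same job can
# target a different branch by changing only the path.
df = spark.read.parquet("s3a://example-repo/main/collections/events/")
df.show()
```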
For more information see the Official Documentation.
Welcome Hacktoberfest participants! We commit to actively seeking out, helping with, and merging your improvements to lakeFS. We've labelled some issues with the hacktoberfest label. Please check out our contributing guide.
We know you like badges, stickers, and T-shirts, because we like them too! But, like many other open-source projects, we are seeing an influx of lower-quality PRs. During October we will be unable to accept PRs if they:
- Only change punctuation or grammar, unless accompanied by an explanation or are clearly better.
- Repeat an existing PR, or try to merge branches authored by other contributors that are under active work.
- Do not affect generated code or documentation in any way.
- Are detrimental: do not compile, or cause harm when run.
- Change text or code that should be changed upstream, such as licenses, code of conduct, or React boilerplate.
We will close such PRs and label them x/invalid; DigitalOcean will not count those PRs towards Hacktoberfest progress, so such PRs only waste your time and ours.
You can help us accept your PR by giving the PR and its commits a clear title and description. "Fixes #1234" or "update README.md" are not as good as "Make lakeFS run 3x faster" or "Add update regarding Hacktoberfest". Communication is key: if you are uncertain, please open a discussion and ask us on the PR or on the issue.
Thanks!
Development Environment for Data
- Experimentation - try tools, upgrade versions, and evaluate code changes in isolation (see the sketch after this list).
- Reproducibility - go back to any point in time to a consistent version of your data lake.
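For example, isolating an experiment comes down to creating a branch. A hedged sketch over the lakeFS REST API, assuming a local instance at http://localhost:8000 and a repository named `example-repo` (the endpoint path and all names are illustrative and may differ between lakeFS versions):

```python
import requests

LAKEFS_API = "http://localhost:8000/api/v1"
AUTH = ("<lakeFS access key>", "<lakeFS secret key>")  # placeholders

# Create an isolated branch of the data lake for an experiment.
resp = requests.post(
    f"{LAKEFS_API}/repositories/example-repo/branches",
    auth=AUTH,
    json={"name": "experiment-new-etl", "source": "main"},
)
resp.raise_for_status()

# Jobs can now read and write s3://example-repo/experiment-new-etl/... in
# isolation from main, and the branch can be discarded when the experiment ends.
```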
Continuous Data Integration
- Ingest new data safely by enforcing best practices - make sure new data sources adhere to your lake's best practices, such as format and schema enforcement, naming conventions, etc.
- Metadata validation - prevent breaking changes from entering the production data environment (see the sketch after this list).
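One hedged sketch of such a gate: new data first lands on an ingest branch, and a validation job reads it through the S3-compatible gateway and checks the schema before the branch is ever merged into production. The endpoint, credentials, repository, branch, object path, and expected columns are all illustrative placeholders:

```python
import io

import boto3
import pyarrow.parquet as pq

# Talk to the lakeFS S3 gateway; bucket = repository, key = <branch>/<path>.
s3 = boto3.client(
    "s3",
    endpoint_url="http://localhost:8000",
    aws_access_key_id="<lakeFS access key>",
    aws_secret_access_key="<lakeFS secret key>",
)

EXPECTED_COLUMNS = {"event_id", "user_id", "event_time"}

# Read a newly ingested file from the ingest branch and inspect its schema.
obj = s3.get_object(Bucket="example-repo", Key="ingest-batch-1/events/part-0000.parquet")
schema = pq.read_schema(io.BytesIO(obj["Body"].read()))

# Fail the pipeline before the ingest branch is merged into main.
missing = EXPECTED_COLUMNS - set(schema.names)
if missing:
    raise ValueError(f"ingest branch violates the expected schema, missing: {missing}")
```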
Continuous Data Deployment
- Instantly revert changes to data - if low-quality data is exposed to your consumers, you can instantly revert to a former, consistent and correct snapshot of your data lake.
- Enforce cross-collection consistency - provide consumers with several collections of data that must be synchronized, in one atomic, revertible action (see the sketch after this list).
- Prevent data quality issues by enabling:
  - Testing of production data before exposing it to users / consumers.
  - Testing of intermediate results in your DAG to avoid cascading quality issues.
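A hedged sketch of such an atomic promotion, assuming changes were written to a staging branch: commit them, then merge staging into main so consumers see every collection update together, in one revertible step. The REST endpoints and all names below are illustrative and may differ between lakeFS versions:

```python
import requests

LAKEFS_API = "http://localhost:8000/api/v1"
AUTH = ("<lakeFS access key>", "<lakeFS secret key>")  # placeholders
REPO = "example-repo"

# Commit everything written to the staging branch: several collections at once.
requests.post(
    f"{LAKEFS_API}/repositories/{REPO}/branches/staging/commits",
    auth=AUTH,
    json={"message": "publish events and users collections together"},
).raise_for_status()

# Merge staging into main: consumers see all collections change atomically,
# and the merge commit can be reverted if a quality issue slips through.
requests.post(
    f"{LAKEFS_API}/repositories/{REPO}/refs/staging/merge/main",
    auth=AUTH,
    json={"message": "promote validated data to production"},
).raise_for_status()
```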
Getting Started
Using Docker Compose (macOS and Linux)
- Ensure you have Docker and Docker Compose installed on your computer.
- Run the following command:
  curl https://compose.lakefs.io | docker-compose -f - up
- Open http://127.0.0.1:8000/setup in your web browser to set up an initial admin user, used to log in and send API requests.
On Windows
- Ensure you have Docker installed.
- Run the following command in PowerShell:
  Invoke-WebRequest https://compose.lakefs.io | Select-Object -ExpandProperty Content | docker-compose -f - up
- Open http://127.0.0.1:8000/setup in your web browser to set up an initial admin user, used to log in and send API requests.
Alternatively, you can download the lakeFS binaries and run them directly.
Binaries are available at https://github.com/treeverse/lakeFS/releases.
Please follow the Guide to Get Started to set up your local lakeFS installation.
For more detailed information on how to set up lakeFS, please visit the documentation.
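Once lakeFS is up, a minimal sketch of the first steps: create a repository over the REST API, then write and read an object through the S3-compatible gateway. The credentials come from the setup step above; the repository name and the local:// storage namespace (which assumes the Docker quickstart's local blockstore) are placeholders:

```python
import boto3
import requests

LAKEFS = "http://localhost:8000"
AUTH = ("<access key from setup>", "<secret key from setup>")

# Create a repository. On S3 or Google Cloud Storage the storage namespace
# would be a bucket URI instead of the local:// namespace assumed here.
requests.post(
    f"{LAKEFS}/api/v1/repositories",
    auth=AUTH,
    json={
        "name": "example-repo",
        "storage_namespace": "local://example-repo",
        "default_branch": "main",
    },
).raise_for_status()

# Write and read an object through the S3 gateway: bucket = repository,
# key = <branch>/<path>.
s3 = boto3.client(
    "s3",
    endpoint_url=LAKEFS,
    aws_access_key_id=AUTH[0],
    aws_secret_access_key=AUTH[1],
)
s3.put_object(Bucket="example-repo", Key="main/hello.txt", Body=b"hello lakeFS")
print(s3.get_object(Bucket="example-repo", Key="main/hello.txt")["Body"].read())
```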
Stay up to date and get lakeFS support via:
- Slack (to get help from our team and other users).
- Twitter (follow for updates and news).
- YouTube (learn from video tutorials).
- Contact us (for anything).
- lakeFS documentation
- If you would like to contribute, check out our contributing guide.
lakeFS is completely free and open source, licensed under the Apache 2.0 License.