Readme cleanup
scnerd committed Sep 10, 2024
1 parent 852dd76 commit 4eff18e
Showing 1 changed file with 9 additions and 7 deletions.
README.md (16 changes: 9 additions & 7 deletions)
@@ -16,10 +16,17 @@
print(convert_spl_to_pyspark(r"""multisearch
[index=regionA | fields +country, orders]
[index=regionB | fields +country, orders]"""))

-# spark.table("regionA").select(F.col("country"), F.col("orders")).unionByName(
-#     spark.table("regionB").select(F.col("country"), F.col("orders")),
+# spark.table("regionA").select(
+#     F.col("country"),
+#     F.col("orders"),
+# ).unionByName(
+#     spark.table("regionB").select(
+#         F.col("country"),
+#         F.col("orders"),
+#     ),
#     allowMissingColumns=True,
# )

```
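For readers unfamiliar with `unionByName`, the `allowMissingColumns=True` flag aligns rows by column name and fills columns absent from one side with nulls. A rough pure-Python sketch of that merge semantics (an illustration only, not the PySpark implementation):

```python
def union_by_name(rows_a, rows_b):
    # Mimic DataFrame.unionByName(..., allowMissingColumns=True):
    # align rows by column name, filling missing columns with None.
    columns = sorted({k for row in rows_a + rows_b for k in row})
    return [{c: row.get(c) for c in columns} for row in rows_a + rows_b]

merged = union_by_name(
    [{"country": "US", "orders": 3}],
    [{"country": "FR"}],  # no "orders" column on this side
)
# merged[1]["orders"] is None
```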

## Interactive CLI
@@ -56,8 +63,3 @@
However, it lays a solid foundation for the whole process and is modular enough

Ways to contribute:
- Add SPL queries and what the equivalent PySpark code would be. These test cases can drive development and prioritize the most commonly used features.
- Add support for additional functions and commands. While the SPL parser works for most commands, many do not yet have the ability to render back out to PySpark. Please see `/src/pyspark/transpiler/command/eval_fns.rs` and `/convert_fns.rs` to add support for more in-command functions, and `/src/pyspark/transpiler/command/mod.rs` to add support for more top-level commands.
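To give a feel for the second kind of contribution, here is a minimal, hypothetical sketch of mapping an SPL eval function name to a PySpark expression string. The function name, signature, and mappings below are assumptions for illustration; the real renderer in `eval_fns.rs` is more involved:

```rust
// Hypothetical sketch: translate an SPL eval function call into a
// PySpark expression string. Returns None for unsupported functions
// so the caller can report or fall back.
fn render_eval_fn(name: &str, args: &[&str]) -> Option<String> {
    match name {
        "lower" => Some(format!("F.lower({})", args.join(", "))),
        "upper" => Some(format!("F.upper({})", args.join(", "))),
        "len" => Some(format!("F.length({})", args.join(", "))),
        _ => None,
    }
}
```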

# Future

- Add UI (with `textual`) for interactive demonstration/use
