From 3e852e551b41f4c52045b35a6f8ea7645dd16000 Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?Moritz=20H=C3=B6lting?= <87192362+moritz-hoelting@users.noreply.github.com>
Date: Wed, 8 May 2024 15:26:12 +0200
Subject: [PATCH] add chapter "how it is made" to mensa-upb-cli project

---
 src/content/projects/mensa-upb-cli/index.md | 64 +++++++++++++++++++++
 1 file changed, 64 insertions(+)

diff --git a/src/content/projects/mensa-upb-cli/index.md b/src/content/projects/mensa-upb-cli/index.md
index 6abe7eb..0078b60 100644
--- a/src/content/projects/mensa-upb-cli/index.md
+++ b/src/content/projects/mensa-upb-cli/index.md
@@ -46,3 +46,67 @@ It works by parsing the command-line arguments with [clap](https://crates.io/cra
 ```bash
 mensa-upb-cli -p student
 ```
+
+## How it is made
+
+My university has multiple cafeterias, and the website shows the menu of each one on a separate page. I did not like checking multiple pages when choosing what and where to eat.
+
+Therefore, I decided to build an application that makes this process easier. My solution consists of four steps:
+
+1. Read the user input
+2. Fetch the data
+3. Filter the data based on user input
+4. Output the data in a readable way
+
+### 1. Read the user input
+
+For reading the CLI arguments, I chose [clap](https://crates.io/crates/clap), a popular library that is very easy to use by deriving its traits.
+
+```rust
+use clap::Parser;
+
+// Note: `PriceLevel`, `Extra` and the `u8` for `days_ahead` are assumed
+// types in this snippet; `Mensa` is the value enum used for the defaults below.
+#[derive(Parser)]
+#[command(author, version, about, long_about = None)]
+struct Cli {
+    /// Choose the mensa
+    #[arg(short, long, value_enum, default_values_t = [Mensa::Forum, Mensa::Academica])]
+    mensa: Vec<Mensa>,
+    /// Choose the price level
+    #[arg(short, long)]
+    price_level: Option<PriceLevel>,
+    /// Choose how many days in the future to fetch
+    #[arg(short, long)]
+    days_ahead: Option<u8>,
+    /// Filter by extras
+    #[arg(short, long)]
+    extras: Vec<Extra>,
+}
+```
+
+### 2. Fetch the data
+
+Because there is no API for our cafeterias, I had to scrape the website. I used [reqwest](https://crates.io/crates/reqwest) for fetching the HTML and [scraper](https://crates.io/crates/scraper) for extracting the required information from it. Simplified sketches of the fetching, filtering and table-building steps are included at the end of this post.
+
+### 3. Filter the data based on user input
+
+Filtering happens in multiple places:
+
+- only the pages of the requested cafeterias and the selected days are fetched and parsed
+- only the selected price level (student, employee, guest) is shown
+- only the meals that match the selected extras (vegetarian, vegan) are kept
+
+### 4. Output the data in a readable way
+
+For readability, I chose to display the meals in a table. I also wanted to group the meals by category (main dishes, side dishes, and desserts). A nice library I found for printing tables to the terminal is [comfy-table](https://crates.io/crates/comfy-table). It allows customizing the borders and the alignment of each cell.
+
+```rust
+use comfy_table::{Cell, CellAlignment, Row};
+
+// Centered, underlined and overlined header cell for the dessert section
+let mut desserts_row = Row::new();
+desserts_row.add_cell(
+    Cell::from("Desserts")
+        .set_alignment(CellAlignment::Center)
+        .add_attribute(comfy_table::Attribute::Underlined)
+        .add_attribute(comfy_table::Attribute::OverLined),
+);
+```
+
+## Source Code
+
+If you want to take a look at the source code, it is available on my [GitHub](https://github.com/moritz-hoelting/mensa-upb-cli). You could even try to adapt the web requests and scraping to the cafeteria you go to.
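+
+As a rough starting point for that, here is a simplified sketch of the fetching and parsing step (step 2). The CSS selectors and the `Meal` struct are made up for illustration and do not match the real site or the code in the repository; the idea is just to download the HTML with reqwest and then pull the interesting pieces out with scraper.
+
+```rust
+use scraper::{Html, Selector};
+
+/// Simplified stand-in for a scraped menu entry.
+#[derive(Debug)]
+struct Meal {
+    name: String,
+    price_student: String,
+}
+
+fn fetch_menu(url: &str) -> Result<Vec<Meal>, Box<dyn std::error::Error>> {
+    // Download the raw HTML of the menu page (requires reqwest's "blocking" feature)
+    let html = reqwest::blocking::get(url)?.text()?;
+    let document = Html::parse_document(&html);
+
+    // Hypothetical selectors - the real ones depend on the markup of the menu page
+    let meal_selector = Selector::parse(".meal").unwrap();
+    let name_selector = Selector::parse(".meal-name").unwrap();
+    let price_selector = Selector::parse(".price-student").unwrap();
+
+    // Walk over every meal element and extract its name and student price
+    let meals = document
+        .select(&meal_selector)
+        .filter_map(|meal| {
+            let name: String = meal.select(&name_selector).next()?.text().collect();
+            let price: String = meal.select(&price_selector).next()?.text().collect();
+            Some(Meal {
+                name: name.trim().to_owned(),
+                price_student: price.trim().to_owned(),
+            })
+        })
+        .collect();
+
+    Ok(meals)
+}
+```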
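+
+The filtering from step 3 can then be pictured as plain iterator chains over the parsed meals. Again, `Meal` and `Extra` here are simplified stand-ins, not the types used in the real project:
+
+```rust
+/// Simplified stand-in for the extras filter (the real project may model this differently).
+#[derive(Debug, PartialEq)]
+enum Extra {
+    Vegetarian,
+    Vegan,
+}
+
+/// Simplified meal type carrying only what this sketch needs.
+struct Meal {
+    name: String,
+    extras: Vec<Extra>,
+}
+
+/// Keep only the meals that have every requested extra.
+fn filter_by_extras(meals: Vec<Meal>, requested: &[Extra]) -> Vec<Meal> {
+    meals
+        .into_iter()
+        .filter(|meal| requested.iter().all(|extra| meal.extras.contains(extra)))
+        .collect()
+}
+```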
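+
+And for step 4, assembling and printing a whole table with comfy-table could look roughly like this. The header and the example row are placeholders; in the real tool the rows come from the scraped data:
+
+```rust
+use comfy_table::presets::UTF8_FULL;
+use comfy_table::{ContentArrangement, Table};
+
+fn main() {
+    let mut table = Table::new();
+    table
+        // UTF-8 borders and dynamic column widths
+        .load_preset(UTF8_FULL)
+        .set_content_arrangement(ContentArrangement::Dynamic)
+        .set_header(vec!["Meal", "Student", "Employee", "Guest"]);
+
+    // Placeholder row - one row per scraped meal in the real tool
+    table.add_row(vec!["Example meal", "2,50 €", "3,50 €", "4,50 €"]);
+
+    // `Table` implements `Display`, so printing it renders the whole table
+    println!("{table}");
+}
+```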