Commit ceb46f6bb1 (parent f75a8b24e5) by Koleh David, 2025-02-06 16:18:37 +10:30 — "r more docs"
This is my implementation of the [Gilded Rose Kata](https://github.com/emilybache/GildedRose-Refactoring-Kata).
## Tidy/refactor -> behavior change
`/main`
The main branch contains only the refactoring; the new requirement is not yet implemented.
I use the lift up conditional refactoring, inspired by Emily Bache and Llewellyn Falco.
See their videos here:
- Emily Bache:
- Llewellyn Falco:
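To make the "lift up conditional" idea concrete, here is a minimal, hedged sketch in Python — not the kata's actual code, and the function names (`update_quality_before`, `update_quality_after`) are invented for illustration. The refactoring hoists a buried condition to the top of the method, duplicating the body into one branch per case so each case can then be simplified independently:

```python
# Illustrative sketch of the "lift up conditional" refactoring.
# (Hypothetical simplified functions, not the kata's real code.)

def update_quality_before(name: str, quality: int) -> int:
    # Before: the special case is buried inside shared logic.
    if name != "Aged Brie":
        quality -= 1
    else:
        quality += 1
    return max(0, min(50, quality))

def update_quality_after(name: str, quality: int) -> int:
    # After lifting the conditional to the top: each branch carries a
    # full copy of the body, so per-case logic can now be simplified
    # (or extended with new requirements) without touching other cases.
    if name == "Aged Brie":
        quality += 1
        return max(0, min(50, quality))
    else:
        quality -= 1
        return max(0, min(50, quality))
```

The duplication looks like a step backwards, but with characterisation tests pinning the behaviour, it is a safe intermediate state on the way to cleaner per-case code.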
`/feature/implement-new-requirement`
In this branch, I have implemented the new requirement.
`/archive/original-code`
This branch contains the original code, before the refactoring.
# Getting set up
First, set up a Python 3.12 environment using [venv](https://docs.python.org/3/library/venv.html).
Then, install the requirements:
```bash
pip install -r requirements.txt
```
# Running the tests
This project uses [characterisation testing](https://www.youtube.com/watch?v=me-Nikc5eak) to ensure the behavior of the code remains the same during refactoring.
Specifically, it uses [pytest-approvals](https://github.com/approvals/ApprovalTests.Python)
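The core idea behind characterisation (approval) testing can be shown without any library: record the current output of the code once as the "approved" snapshot, then assert that every later run still "receives" the same output. This is only a sketch of the concept — the function `legacy_degrade` is hypothetical, and the project itself uses ApprovalTests as linked above:

```python
# Minimal characterisation-test sketch: snapshot the legacy behaviour,
# then fail loudly if a refactoring changes it.

def legacy_degrade(quality: int, sell_in: int) -> int:
    # Pretend this is legacy code whose exact behaviour we must preserve.
    return max(0, quality - (2 if sell_in < 0 else 1))

# The "approved" snapshot, captured from the code as it stands today.
APPROVED = [legacy_degrade(q, s) for q in (0, 1, 10) for s in (-1, 0, 5)]

def characterisation_test() -> None:
    # The "received" output from the (possibly refactored) code.
    received = [legacy_degrade(q, s) for q in (0, 1, 10) for s in (-1, 0, 5)]
    assert received == APPROVED  # any behaviour change fails the test

characterisation_test()
```

Approval-testing libraries automate exactly this approved/received comparison, storing the snapshot in a file and launching a diff reporter when the two disagree.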
```bash
coverage run -m pytest
coverage report --fail-under=100
```
This will run the tests, and will also:
- automatically bring up [diffuse](https://diffuse.sourceforge.net/download.html) to view differences side by side, if there are any behavior changes.
- feel free to install and use a diff reporter of your choice. See pytest docs for more info on configuring reporters: [selecting a reporter](https://github.com/approvals/ApprovalTests.Python?tab=readme-ov-file#selecting-a-reporter)
- warn you if your coverage falls below 100% (either you need more tests, or there is dead code)
- generate a coverage report (You can view this in the `htmlcov` directory by opening `index.html` in a browser)