Write an Express app that uses a file for persistent storage.
- Use ES6 wherever possible (e.g. `let`, `const`, destructuring, and fat arrow functions)
- Use the GitHub Workflow
- Make regular commits
  - Make a commit every time you complete a feature
  - Give each commit a meaningful commit message
  - Don't expect the directions in this README to tell you when to commit
- All file operations must be asynchronous
- Your code must be perfectly indented. If we can't read it, we aren't going to grade it.
- Work in your project directory at all times
- All access to the data (both in the file and in memory) can only be done through your `data_store` module
- Fork and clone this repo

```
touch app.js
npm init -y
npm install --save express
echo node_modules > .gitignore
git add .
git commit -m "Initial project setup"
```

```
npm install --save-dev nodemon
```

- Open `package.json`
- Inside `scripts`, add `` "watch": "`npm bin`/nodemon" ``
  - make sure that you have the trailing commas in the right places!

```
npm run watch
```
- Open `app.js`
- Require Express
- Initialize an `app`
- Determine the port to use
  - The port to use could be passed in as the second command line argument
  - If not provided, default to 8000
- Store the port in a `const`
- Use `app.listen` to bind and listen for connections on the above port
- Check that this works by running `nodemon app.js` in your directory
- Git add, commit, push
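The port-selection step above can be sketched like this (a minimal sketch: the `PORT` name is an illustrative choice, the default of 8000 comes from the directions, and `process.argv[2]` is where Node puts the first argument after the script name):

```javascript
// Pick the port from the command line, falling back to 8000 when
// none is given: `node app.js 3000` uses 3000, `node app.js` uses 8000.
const PORT = Number(process.argv[2]) || 8000;

// The rest of app.js would then look roughly like:
//   const express = require('express');
//   const app = express();
//   app.listen(PORT, () => console.log(`Listening on ${PORT}`));
```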
```
mkdir db
```

- `db` is short for "database"

```
cd db
touch seed.json
```

- this is where our initial data (i.e. "seed data") is stored
- this file will be used to reset our database file whenever we want to start over
- open `seed.json` and paste the following inside:

```json
[
  { "id": 1, "author": "Marijn Haverbeke", "title": "Eloquent JavaScript" },
  { "id": 2, "author": "Nick Morgan", "title": "JavaScript for Kids" },
  { "id": 3, "author": "Kyle Simpson", "title": "You Don't Know JS" }
]
```
- Go back to your project directory
- Open `package.json`
- Inside `scripts`, add `"reset": "cp db/seed.json db/data.json"`
  - make sure that you have the trailing commas in the right places!

```
npm run reset
ls
```
- notice that there is now a `data.json` in the `db` directory
- `db/data.json` is the file that we are going to modify

```
echo db/data.json >> .gitignore
```

- database files are typically not checked into source control, because they can get large, have nothing to do with development, and might hold sensitive data
- Create a new file called `data_store.js`
- Open `data_store.js`
- Create an empty module in here. We will fill it in the following steps.
- Open `app.js`
- Require your `data_store` module
- Open `data_store.js`
- Write a function called `load_from_file` that reads all the contents of the `db/data.json` file into memory
  - TIP: "into memory" means save it in a variable (use a global variable)
  - File read/write is slow, so we will work from memory as much as possible and only update the file when we have to.
- Export this function.
- Open `app.js`
- Call `load_from_file` before you call `app.listen`
- Open `data_store.js`
- Write a function called `get_all_books` that returns an array of all the books that are in memory
- Export this function.
- Define a GET route at `/api/books`
  - The route should send a json response with an array of all the books
  - use your data store's `get_all_books` function to achieve this
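A sketch of this step (assuming `BOOKS` is the module-level array filled by `load_from_file`; the sample entry is from the seed data, and the Express wiring is shown as comments since it lives in `app.js`):

```javascript
// data_store.js, continued. BOOKS stands in for the array that
// load_from_file would normally populate.
let BOOKS = [
  { id: 1, author: 'Marijn Haverbeke', title: 'Eloquent JavaScript' },
];

// Return every book currently held in memory.
const get_all_books = () => BOOKS;

module.exports = { get_all_books };

// In app.js the route would look roughly like:
//   app.get('/api/books', (req, res) => res.json(data_store.get_all_books()));
```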
- Open `data_store.js`
- Write a function called `get_book_by_id(id)`
  - returns the book with that ID
  - if no book is found, return `undefined`
- Export this function.
- Define a GET route at `/api/books/:id`
  - The route should send a json response with the book that has the given ID
  - use your data store's `get_book_by_id(id)` function to achieve this
  - If there is no book with the given ID, respond with 404 Not Found
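A sketch of `get_book_by_id` (again assuming the `BOOKS` array; the sample data is from the seed file):

```javascript
// BOOKS stands in for the array load_from_file would populate.
let BOOKS = [
  { id: 1, author: 'Marijn Haverbeke', title: 'Eloquent JavaScript' },
  { id: 2, author: 'Nick Morgan', title: 'JavaScript for Kids' },
];

// Array.prototype.find returns undefined when nothing matches, which is
// exactly the contract the directions ask for. Route params arrive as
// strings, so coerce the ID before comparing.
const get_book_by_id = (id) => BOOKS.find((book) => book.id === Number(id));

module.exports = { get_book_by_id };

// In app.js, roughly:
//   app.get('/api/books/:id', (req, res) => {
//     const book = data_store.get_book_by_id(req.params.id);
//     if (!book) return res.sendStatus(404);
//     res.json(book);
//   });
```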
- Open `data_store.js`
- Write a function called `write_to_file` that writes all the books that are in memory back into the file
  - Make sure that you can read the data back out of the file by using `load_from_file`
- Do NOT export this function.
  - This function is going to be a "private" or "secret" function that only your module can use
- Do NOT export the above-mentioned variable
  - We want to restrict access to only the functions that we export
- Open `data_store.js`
- Create a new global variable called `LAST_ID`
- When you call `load_from_file`, update `LAST_ID` to be the largest ID that was loaded from the file
- Write a function called `add_book` that:
  - takes an object as a parameter
  - gives it a unique ID
    - TIP: just add 1 to `LAST_ID` and use that
  - adds it to the books that are already in memory
  - calls `write_to_file` to update the file
  - returns the added book (with its unique ID)
- Export this function
- Define a POST route at `/api/books`
  - Install and use `body-parser`
  - Get the body of the request and pass it to `add_book`
  - The route should send a json response with the newly-created book
- Open `data_store.js`
- Write a function called `update_book` that:
  - takes an ID as a parameter
  - takes an object as a parameter
  - finds the book with that ID
    - if it is not found, return `undefined`
  - updates that book to have the information in the object
    - do NOT update the ID
  - calls `write_to_file` to update the file
  - returns the updated book
- Export this function
- Define a PUT route at `/api/books/:id`
  - Get the body of the request and pass it to `update_book`
  - The route should send a json response with the newly-updated book
  - If there is no book with the given ID, respond with 404 Not Found
- Open `data_store.js`
- Write a function called `delete_book` that:
  - takes an ID as a parameter
  - finds the book with that ID
    - if it is not found, return `undefined`
  - removes that book from the global variable
  - calls `write_to_file` to update the file
  - returns the removed book
- Export this function
- Define a DELETE route at `/api/books/:id`
  - The route should send a json response with the book that was deleted
  - use your data store's `delete_book` function to achieve this
  - If there is no book with the given ID, respond with 404 Not Found
- Make sure you add, commit, and push all your changes to your fork of this repo, then open a pull request
- Implement the data store using a `Map`
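An in-memory sketch of the `Map` variant, assuming the exported API stays the same so `app.js` needs no changes (file persistence is omitted here to keep the sketch short; you would serialize `[...BOOKS.values()]` in `write_to_file` and rebuild the Map in `load_from_file`):

```javascript
// Same public API as before, but books live in a Map keyed by ID
// instead of an array, so lookups by ID no longer scan the whole list.
const BOOKS = new Map();
let LAST_ID = 0;

const get_all_books = () => [...BOOKS.values()];

// Map.prototype.get returns undefined for a missing key, matching the
// contract of the array version. Coerce route-param strings to numbers.
const get_book_by_id = (id) => BOOKS.get(Number(id));

const add_book = (book) => {
  LAST_ID += 1;
  const added = { ...book, id: LAST_ID };
  BOOKS.set(LAST_ID, added);
  return added;
};

const delete_book = (id) => {
  const removed = BOOKS.get(Number(id));
  BOOKS.delete(Number(id));
  return removed;
};

module.exports = { get_all_books, get_book_by_id, add_book, delete_book };
```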