Lesson Weekend

Before this week, data for our programs has been stored in our computers' random-access memory, or RAM. Memory (RAM) is a fast, temporary place to store information, but it is not suitable for long-term storage. As you undoubtedly experienced over the past two weeks, if you shut down your server and re-launch your application, most of your program's data is gone!

This week, we will begin persisting data with full-blown databases! We'll start by writing our test data into a PostgreSQL database that lives only in the computer's memory, and then we'll transition to using a production database.
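To make those two setups concrete, here is a minimal sketch of what the connections might look like. It assumes the Sql2o library on top of JDBC, with H2 running in PostgreSQL-compatibility mode standing in for the in-memory test database; the URLs, database name, and credentials are placeholders, not this course's exact configuration.

```java
import org.sql2o.Sql2o;

public class DB {
  // Test database: exists only in memory and vanishes when the JVM exits.
  // (Hypothetical URL; H2 in PostgreSQL-compatibility mode is one common choice.)
  public static final Sql2o TEST =
      new Sql2o("jdbc:h2:mem:testing;MODE=PostgreSQL;DB_CLOSE_DELAY=-1", "", "");

  // Production database: a real PostgreSQL server that persists data to disk.
  // (Placeholder database name and credentials.)
  public static final Sql2o PRODUCTION =
      new Sql2o("jdbc:postgresql://localhost:5432/todolist", "postgres", "password");
}
```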

We'll begin by learning about SQL and how relational databases work. Then, we'll cover how to set up and configure our very own databases, including best practices for naming and data organization/architecture. Next, we'll learn how to integrate databases into our Java-backed apps, and how to retrieve, store, update, and delete database entries directly within our Spark applications. On top of that, we'll learn to work with objects within objects in our Spark apps - such as assigning Task objects to Category objects, a skill you can use to further enhance your Blog from last week.
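To preview those conventions, here is one hypothetical schema for the Task/Category example, created through Sql2o: table names are lowercase and plural, columns are snake_case, and the foreign key sits on the "many" side of the relationship. This is one conventional shape, not a required schema.

```java
import org.sql2o.Connection;
import org.sql2o.Sql2o;

public class Schema {
  public static void create(Sql2o sql2o) {
    try (Connection con = sql2o.open()) {
      // "categories" table: the "one" side of the relationship.
      con.createQuery(
          "CREATE TABLE IF NOT EXISTS categories (" +
          " id SERIAL PRIMARY KEY," +
          " name VARCHAR(255))")
          .executeUpdate();
      // "tasks" table: the "many" side. Each task points at exactly one
      // category through the category_id foreign key.
      con.createQuery(
          "CREATE TABLE IF NOT EXISTS tasks (" +
          " id SERIAL PRIMARY KEY," +
          " description VARCHAR(255)," +
          " category_id INTEGER REFERENCES categories(id))")
          .executeUpdate();
    }
  }
}
```

With this shape, finding all the tasks in a category is a single query filtered on category_id.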

Additionally, we'll learn how to update and delete objects in our database from within an application, and how to write tests that assert all database functionality is working correctly. Then, once we feel a little more comfortable, we'll explore more advanced SQL queries to return very specific database entries or types of information.
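A test for that update-and-delete behavior might look something like the sketch below, assuming JUnit 4 and Sql2o against the in-memory test database described earlier; the table, URL, and data are illustrative only.

```java
import org.junit.Before;
import org.junit.Test;
import static org.junit.Assert.*;
import org.sql2o.Connection;
import org.sql2o.Sql2o;

public class TaskPersistenceTest {
  private Sql2o sql2o;

  @Before
  public void setUp() {
    // Hypothetical in-memory database, kept alive between connections
    // (DB_CLOSE_DELAY=-1) and emptied before every test.
    sql2o = new Sql2o("jdbc:h2:mem:testing;MODE=PostgreSQL;DB_CLOSE_DELAY=-1", "", "");
    try (Connection con = sql2o.open()) {
      con.createQuery(
          "CREATE TABLE IF NOT EXISTS tasks (id SERIAL PRIMARY KEY, description VARCHAR(255))")
          .executeUpdate();
      con.createQuery("DELETE FROM tasks").executeUpdate();
    }
  }

  @Test
  public void updatingAndDeleting_changeWhatTheDatabaseReturns() {
    try (Connection con = sql2o.open()) {
      // Create a row, then look up its generated id.
      con.createQuery("INSERT INTO tasks (description) VALUES (:d)")
          .addParameter("d", "Mow the lawn").executeUpdate();
      int id = con.createQuery("SELECT id FROM tasks WHERE description = :d")
          .addParameter("d", "Mow the lawn").executeScalar(Integer.class);

      // Update the row, and assert the change shows up on a fresh read.
      con.createQuery("UPDATE tasks SET description = :d WHERE id = :id")
          .addParameter("d", "Mow the lawn twice").addParameter("id", id)
          .executeUpdate();
      assertEquals("Mow the lawn twice",
          con.createQuery("SELECT description FROM tasks WHERE id = :id")
              .addParameter("id", id).executeScalar(String.class));

      // Delete the row, and assert it is really gone.
      con.createQuery("DELETE FROM tasks WHERE id = :id")
          .addParameter("id", id).executeUpdate();
      assertEquals(0, (int) con.createQuery("SELECT count(*) FROM tasks")
          .executeScalar(Integer.class));
    }
  }
}
```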

By the end of this week, you'll be able to persist data, handle exceptions, work with nested data and extended routing, and much more!

Independent Project Objectives

This week's independent project will be reviewed on the following criteria:

  • Do the database table and column names follow proper naming conventions?
  • Is there a one-to-many relationship set up correctly in the database?
  • Is CRUD functionality included for each class in Spark?
  • Are RESTful routes used in Spark? (See the route sketch after this list for one possible shape.)
  • Have all of the standards from previous weeks been met?
  • If prompted, are you able to discuss the flow of your code and concepts behind it with an instructor using the correct terminology?
  • Is the project in a polished, portfolio-quality state?
  • Does the application work as expected?
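
For reference on the CRUD and RESTful-route criteria, here is a minimal sketch of resource-oriented routes in Spark. The paths and handler bodies are illustrative placeholders, not a required implementation; form-driven apps commonly route updates and deletes through POST because HTML forms can only send GET and POST requests.

```java
import static spark.Spark.*;

public class App {
  public static void main(String[] args) {
    get("/tasks", (req, res) -> "render every task");                            // Read: index
    get("/tasks/new", (req, res) -> "render a blank task form");                 // form for Create
    post("/tasks", (req, res) -> "create a task from the form fields");          // Create
    get("/tasks/:id", (req, res) -> "render task " + req.params(":id"));         // Read: show one
    post("/tasks/:id/update", (req, res) -> "update task " + req.params(":id")); // Update
    post("/tasks/:id/delete", (req, res) -> "delete task " + req.params(":id")); // Delete
  }
}
```

Each route names the resource (tasks) in the path; the update and delete paths carry a verb only because of the GET/POST limitation noted above.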