Final answer:
The lateness (li) of a request in the minimize lateness problem is defined as li = max(fi - di, 0): it measures how far past its deadline a job finishes, with any negative value (finishing early) treated as zero.
Step-by-step explanation:
The minimize lateness problem is an optimization problem in scheduling and computer science. Given a set of requests, each with a duration ti and a deadline di, the jobs are scheduled one after another, and the lateness of a request can only be determined once the job has finished. If si is the time a job starts, its finish time is fi = si + ti, and its lateness is li = max(fi - di, 0), where fi denotes the finish time of the request and di its deadline. This measures the amount by which the task has exceeded its deadline, with the understanding that if a task finishes at or before its deadline, its lateness is zero. The usual objective is to minimize the maximum lateness over all requests.
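The definition above can be sketched in a few lines of Python. This is a minimal illustration (the job values and function names are made up for the example): jobs are run back-to-back starting at time 0 in Earliest-Deadline-First order, the standard greedy rule for minimizing maximum lateness, and each job's lateness is computed as max(fi - di, 0).

```python
def lateness(finish, deadline):
    # li = max(fi - di, 0): zero if the job finishes on time
    return max(finish - deadline, 0)

def schedule_edf(jobs):
    """jobs: list of (duration ti, deadline di) pairs.
    Runs jobs back-to-back from time 0 in Earliest-Deadline-First
    order and returns the lateness of each job in that order."""
    ordered = sorted(jobs, key=lambda job: job[1])  # sort by deadline di
    time, late = 0, []
    for duration, deadline in ordered:
        time += duration  # fi = si + ti: jobs run with no idle time
        late.append(lateness(time, deadline))
    return late

# Hypothetical instance: (duration, deadline) pairs
late = schedule_edf([(4, 5), (3, 6), (2, 7)])
print(late)       # lateness of each job -> [0, 1, 2]
print(max(late))  # maximum lateness of the schedule -> 2
```

Note that the second and third jobs finish at times 7 and 9, exceeding their deadlines of 6 and 7 by 1 and 2 respectively, while the first job finishes early and so has lateness 0.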