14 votes
Create a program that uses a Java method to determine the length of a line by inputting the X, Y coordinates of the line endpoints. Show the result with a precision of four decimal places.

asked by Siavash Alp (2.7k points)

1 Answer

20 votes

Answer:

import java.util.Scanner;

// Represents a line segment defined by the endpoints (x1, y1) and (x2, y2).
class LineLength {

    private double x1;
    private double y1;
    private double x2;
    private double y2;

    public LineLength(double x1, double y1, double x2, double y2) {
        this.x1 = x1;
        this.y1 = y1;
        this.x2 = x2;
        this.y2 = y2;
    }

    // Computes the length of the line and formats it to four decimal places.
    @Override
    public String toString() {
        double length = Math.sqrt((this.x2 - this.x1) * (this.x2 - this.x1)
                + (this.y2 - this.y1) * (this.y2 - this.y1));
        return String.format("%.4f", length); // println in main supplies the newline
    }
}

class Main {

    public static void main(String[] args) {
        Scanner scanner = new Scanner(System.in);

        System.out.println("Please input x1, y1, x2, and y2");
        double x1 = scanner.nextDouble();
        double y1 = scanner.nextDouble();
        double x2 = scanner.nextDouble();
        double y2 = scanner.nextDouble();
        scanner.close();

        // Printing the object implicitly calls toString(), which returns the formatted length.
        LineLength test = new LineLength(x1, y1, x2, y2);
        System.out.println(test);
    }
}
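The exercise asks for a Java method that determines the length; if you prefer an explicit helper over overriding toString(), a minimal sketch of such a method could look like this (the name segmentLength is my own choice, not part of the original answer):

    // Returns the distance between (x1, y1) and (x2, y2).
    static double segmentLength(double x1, double y1, double x2, double y2) {
        double dx = x2 - x1;
        double dy = y2 - y1;
        return Math.sqrt(dx * dx + dy * dy);
    }

Calling System.out.printf("%.4f%n", segmentLength(x1, y1, x2, y2)); then keeps the four-decimal formatting.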

Step-by-step explanation:

We apply the distance formula:

length = √((x₂ − x₁)² + (y₂ − y₁)²)
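For example, the endpoints (1, 2) and (4, 6) give √((4 − 1)² + (6 − 2)²) = √(9 + 16) = √25 = 5, which the program prints as 5.0000.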

answered by Moltarze (3.0k points)