Final answer:
In PySpark, you would query the Delta table employees by running the SQL statement SELECT * FROM employees through spark.sql(). You execute it after ensuring the SparkSession is instantiated and the table is registered in the Spark SQL context.
Step-by-step explanation:
In PySpark, to query the Delta table employees created in Spark SQL, you would typically use a SQL query. The correct choice is a) SELECT * FROM employees. This statement is executed through the Spark SQL context integrated within PySpark. You first need to make sure the SparkSession is instantiated and the Delta table is registered as a table or view in the Spark SQL context. Then you can run the query like this:
spark.sql("SELECT * FROM employees").show()
Option b) "employees" is incorrect because on its own it is just a string literal and performs no query. Option c) "sql".read("employees") is not valid PySpark syntax. Option d) "employees" is likewise only a string literal and does not represent any action by itself. Lastly, option e) "Spark SQL tables cannot be accessed from PySpark" is incorrect, as PySpark fully supports querying Spark SQL tables.