Final answer:
To determine word length frequencies and the total number of English alphabetic characters in an input file, you can write two MapReduce programs in Hadoop. In each job, the Map function tokenizes the input text and emits key-value pairs, and the Reduce function aggregates those pairs into the final counts.
Step-by-step explanation:
Finding Word Length Frequencies
To determine the frequency of word lengths within an input file, you can use a MapReduce program in Hadoop. The Map function would tokenize the input text and emit key-value pairs, with the key being the length of the word and the value being 1. The Reduce function would then sum up the values for each key and output the word length frequencies.
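Below is a minimal sketch of such a job using Hadoop's Java MapReduce API (org.apache.hadoop.mapreduce). The class names (WordLengthFrequency, LengthMapper, SumReducer) and the whitespace-based tokenization are illustrative assumptions, not a prescribed implementation.

```java
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordLengthFrequency {

  // Map: split each line on whitespace and emit (word length, 1) per word.
  // This sketch counts every character in the token, including punctuation.
  public static class LengthMapper
      extends Mapper<LongWritable, Text, IntWritable, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final IntWritable length = new IntWritable();

    @Override
    protected void map(LongWritable key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer tokenizer = new StringTokenizer(value.toString());
      while (tokenizer.hasMoreTokens()) {
        length.set(tokenizer.nextToken().length());
        context.write(length, ONE);
      }
    }
  }

  // Reduce: sum the counts for each word length.
  public static class SumReducer
      extends Reducer<IntWritable, IntWritable, IntWritable, IntWritable> {
    private final IntWritable total = new IntWritable();

    @Override
    protected void reduce(IntWritable key, Iterable<IntWritable> values,
        Context context) throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable v : values) {
        sum += v.get();
      }
      total.set(sum);
      context.write(key, total);
    }
  }

  public static void main(String[] args) throws Exception {
    Job job = Job.getInstance(new Configuration(), "word length frequency");
    job.setJarByClass(WordLengthFrequency.class);
    job.setMapperClass(LengthMapper.class);
    job.setCombinerClass(SumReducer.class); // safe, since the sum is associative
    job.setReducerClass(SumReducer.class);
    job.setOutputKeyClass(IntWritable.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```

The output is one line per distinct word length, e.g. "5    123" meaning 123 words of length 5 were found.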
Calculating Total Length of English Alphabetic Characters
To determine the total length of English alphabetic characters in the input file, you can again use a MapReduce program. The Map function would strip out non-alphabetic characters and emit key-value pairs, with the key being a single fixed value (e.g., 'total') and the value being the number of alphabetic characters that remain. Because every pair shares the same key, all values reach one Reduce call, which sums them and outputs the total length.
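A comparable sketch for this second job, under the same assumptions (Java MapReduce API; the class names TotalAlphabeticLength and AlphaMapper, and the choice to count per input line rather than per word, are illustrative):

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class TotalAlphabeticLength {

  // Map: remove every non-alphabetic character from the line and emit
  // ("total", number of remaining characters) under a single fixed key.
  public static class AlphaMapper
      extends Mapper<LongWritable, Text, Text, LongWritable> {
    private static final Text TOTAL_KEY = new Text("total");
    private final LongWritable count = new LongWritable();

    @Override
    protected void map(LongWritable key, Text value, Context context)
        throws IOException, InterruptedException {
      long alphabetic = value.toString().replaceAll("[^A-Za-z]", "").length();
      if (alphabetic > 0) {
        count.set(alphabetic);
        context.write(TOTAL_KEY, count);
      }
    }
  }

  // Reduce: because there is only one key, a single reduce call sums the
  // per-line counts into the grand total.
  public static class SumReducer
      extends Reducer<Text, LongWritable, Text, LongWritable> {
    @Override
    protected void reduce(Text key, Iterable<LongWritable> values,
        Context context) throws IOException, InterruptedException {
      long sum = 0;
      for (LongWritable v : values) {
        sum += v.get();
      }
      context.write(key, new LongWritable(sum));
    }
  }

  public static void main(String[] args) throws Exception {
    Job job = Job.getInstance(new Configuration(), "total alphabetic length");
    job.setJarByClass(TotalAlphabeticLength.class);
    job.setMapperClass(AlphaMapper.class);
    job.setCombinerClass(SumReducer.class); // pre-aggregates per mapper
    job.setReducerClass(SumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(LongWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```

Using a combiner here keeps the single-key design from becoming a bottleneck, since most of the summing happens on the map side before the data is shuffled to the lone reducer.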
Note:
You would need to implement the Map and Reduce functions in a language supported by Hadoop, such as Java (using the native MapReduce API, as sketched above) or Python (via Hadoop Streaming), then package the job and submit it to the cluster, for example with the hadoop jar command.