Solomon Kullback (1907–1994) was an American cryptanalyst and mathematician, one of the first three employees hired by William F. Friedman at the US Army's Signal Intelligence Service (SIS) in the 1930s, along with Frank Rowlett and Abraham Sinkov. He went on to a long and distinguished career at the SIS and its eventual successor, the National Security Agency (NSA). Kullback was the Chief Scientist at the NSA until his retirement in 1962, whereupon he took a position at George Washington University.
When I was working for BDM (now part of Northrop Grumman), I had the title of Deputy Director for Technical Guidance - Operational Test and Evaluation. Our major project was supporting the Army's OTEA (Operational Test and Evaluation Agency). A member of our staff was working on a doctorate in statistics under Solomon Kullback at George Washington University, and was using a manuscript prepared by Kullback himself. Somehow I got a copy of the manuscript.
A staff member told me that an analyst at OTEA disagreed with one of our sample-sizing results. I knew enough to know that if one of our results was wrong, then all of our results were wrong, so I checked it out.
Our methodology was not wrong. I met with the analyst from OTEA. Some face-saving terminology was used; I don't remember what it was.
I purchased a copy of The Mathematical Theory of Communication by Claude E. Shannon and Warren Weaver (I liked Weaver's part best). The book was published by the University of Illinois Press. Shannon's paper appeared in 1948; the book followed in 1949. I bought my copy in the early 1970s.
I got into information theory via Shannon's book and Kullback's manuscript. Mathematical probability has always come easily to me.
While talking with my brother's wife's brother, I couldn't remember Kullback's name. I said that he was in Army communications. Wikipedia says he was with the Army's Signal Intelligence Service (SIS) and its eventual successor NSA. I had no idea that Kullback retired as Chief Scientist of NSA in 1962.
Kullback's statistics and the entropy of physics have a lot in common: his relative entropy (the Kullback–Leibler divergence) and the Gibbs entropy of statistical mechanics are both built from sums of p log p terms.
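To make that connection concrete, here is a minimal sketch in Python (my own illustration, not from Kullback's manuscript): Shannon's entropy is -Σ p log p, Kullback's relative entropy is Σ p log(p/q), and the Gibbs entropy of physics has the same -Σ p log p form, scaled by Boltzmann's constant and using the natural log.

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum p_i * log2(p_i), in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p||q) = sum p_i * log2(p_i/q_i), in bits.

    Zero when p and q match, positive otherwise.
    """
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# A fair coin carries exactly one bit of entropy.
fair = [0.5, 0.5]
biased = [0.9, 0.1]

print(shannon_entropy(fair))          # 1.0 bit
print(kl_divergence(fair, fair))      # 0.0 -- no divergence from itself
print(kl_divergence(biased, fair))    # positive -- the bias is detectable
```

Replacing log2 with the natural log and multiplying by Boltzmann's constant turns shannon_entropy into the Gibbs entropy formula of statistical mechanics; the mathematics is identical.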