Background: Knee osteoarthritis is one of the leading causes of disability worldwide, affecting an estimated 3.8% of the global population. Despite its prevalence, it remains a poorly understood condition for which only limited treatments are currently available. Osteoarthritis (OA) usually starts by affecting the articular cartilage covering the surfaces of the bones in the knee joint, but eventually involves all tissue types as well as the synovial fluid. Clinicians diagnose the disease by examining radiographic images and assigning a severity score from 0 to 4, where 0 denotes a healthy knee and 4 late-stage OA. Current methodologies for the automated study and diagnosis of OA have been applied to small datasets of a few hundred images and tend to analyse only posterior-anterior (PA) view radiographs. In addition, the relationship between structural changes in the joint and symptoms remains poorly understood.

Aim: The aim of this work was to improve the performance of the computer-aided diagnosis techniques available for studying knee OA from medical images. Such techniques have the potential to improve the experience of people affected by this very common disease by supporting and speeding up diagnosis so that appropriate countermeasures can be taken. We propose two main improvements: first, the incorporation of additional informative data, and second, the refinement of the machine learning model. Furthermore, we wanted to contribute to the understanding of the relation between what can be seen in medical images and the symptoms that people affected by OA experience. This has the potential to deepen our understanding of osteoarthritic sources of pain and could ultimately influence the focus of clinical trials.

Methods: Random-forest-based landmark point detectors were built to find the outlines of the bones in knee joint radiographs. Separate models were built for lateral and PA view images.
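The abstract does not detail the landmark detectors, but a minimal sketch of a random-forest landmark regressor of the kind the Methods describe might look as follows; the synthetic patch features and offset targets are illustrative assumptions, not the thesis's actual pipeline:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Toy stand-in for features sampled from image patches around a bone
# outline: each row is a patch feature vector; the target is the (x, y)
# offset from the patch centre to the landmark (synthetic here).
X = rng.normal(size=(200, 16))
y = X[:, :2] * 3.0 + rng.normal(scale=0.1, size=(200, 2))

# A multi-output random forest regresses both coordinates at once.
detector = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, y)

# Predicted offsets for a few candidate patches; in a real detector these
# votes would be aggregated across patches to localise each landmark.
pred_offsets = detector.predict(X[:5])
```

In practice such per-patch predictions are typically pooled (e.g. by voting) to obtain one location per landmark, and a separate model would be trained per view, as the Methods state.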
The resulting annotations allowed the automatic extraction of measurements describing the shape, texture and appearance of the bones and their spatial relation within the radiographs. We used these features to perform several OA-related classification tasks, including the automated diagnosis of structural changes. We proposed an improvement over the previously used classification model with the introduction of what we call "Indecisive Forests", together with ways of optimising such forests after they have been trained. Finally, we performed a comprehensive exploration of radiographic sources of pain and investigated whether a clearer relation between images and symptoms can be found.

Results: Our lateral knee model performed as well as the PA model in the first experiments and showed high discriminative ability, which is notable given that the lateral view is not used by clinicians for grading. Combining features from the two views only marginally improved performance, with the full knee model using appearance features achieving the best overall results. Using the indecisive forest further reduced the number of classification errors on two classification tasks, while the experiments on the proposed optimisation routine were inconclusive regarding its effectiveness. The radiographic structural changes most indicative of pain were a combination of manual features from the lateral and PA views. Consistent knee pain showed a stronger correlation with manual scores than has been reported in the literature.

Conclusions: Our results suggest that features extracted from the lateral view are informative and that using multiple views generally helps performance, though not always by a large margin. Predicting future knee pain proved to be the hardest task for the automated models we used.
Our indecisive-forest-based experiments achieved state-of-the-art results on the tasks of interest, though at the expense of higher computational cost. We presented the highest correlation between radiographic features and frequent knee pain as measured by AUC.
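The abstract does not define "Indecisive Forests"; one plausible reading is a random forest that is allowed to abstain on uncertain cases rather than force a grade. The sketch below, including the margin threshold and the abstention label, is purely illustrative and assumes behaviour the text does not confirm:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

class AbstainingForest:
    """Random-forest wrapper that may abstain (label -1) on uncertain cases.

    Illustrative only: here a small gap between the two highest class
    probabilities triggers abstention; this is an assumed mechanism, not
    the thesis's actual "Indecisive Forest".
    """

    def __init__(self, threshold=0.2, **rf_kwargs):
        self.threshold = threshold
        self.rf = RandomForestClassifier(**rf_kwargs)

    def fit(self, X, y):
        self.rf.fit(X, y)
        return self

    def predict(self, X):
        proba = self.rf.predict_proba(X)
        # Margin between the top two class probabilities per sample.
        top2 = np.sort(proba, axis=1)[:, -2:]
        margin = top2[:, 1] - top2[:, 0]
        preds = self.rf.classes_[np.argmax(proba, axis=1)]
        # Abstain (return -1) where the forest is "indecisive".
        return np.where(margin < self.threshold, -1, preds)

# Synthetic 3-class data standing in for image-derived OA features.
X, y = make_classification(n_samples=300, n_classes=3, n_informative=5,
                           random_state=0)
model = AbstainingForest(threshold=0.3, n_estimators=50, random_state=0).fit(X, y)
preds = model.predict(X)
```

Deferring low-confidence cases to a human reader is one way such a model could reduce classification errors at the cost of extra computation, consistent with the trade-off the Conclusions describe.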