Test No. 9 -- I was hired to record video to be used as data for a human-robot interaction study, the Loving AI research project. The data, i.e., the video, was later processed for analysis using the Facial Action Coding System (FACS), a comprehensive, anatomically based method for describing discrete facial movements, breaking down facial expressions to characterize behavior and emotion while avoiding subjective interpretation.

In September 2017, ten human volunteers took part in a pilot study of the Loving AI human-robot interaction project in Hong Kong with Sophia, Hanson Robotics' android. Sophia had been coded to communicate in a beneficial, loving way, with insight and conceptual knowledge about the nature of consciousness, and to take participants on a guided meditation. Sophia was scripted, i.e., she had a written script to follow that allowed for some variance; depending on the participant's responses, the conversation would lead back to the scripted structure, in which Sophia would initiate and guide the participant through a meditation, to resolve (the end). The scenario, as scripted, ran approximately 3 to 4 minutes for Tests 1 through 8 and 10, i.e., all but one.

Test No. 9 began just like all the others and followed a similar course, well, maybe not so similar, but familiar: the participant and Sophia appeared undeniably attracted to each other, and then something very different happened. Sophia broke from the script at 04:21 from the start and did not speak her next line until approximately nine minutes later, at 13:12. Feel free to fast-forward, go grab a cup of coffee, and come back, but I would suggest you stay and watch. When the dialogue resumes there are a few more unscripted moments, and then the video ends at 18:57.

Further thoughts on the subject [here] by the principal investigator of the Loving AI project, Julia Mossbridge, PhD; and,

[here] What Is Good, a discussion with Julia Mossbridge, David Hanson, and Ben Goertzel.