Alex Engler - Mentions and Appearances
"This could further concentrate power over the future of AI in large technology companies and prevent research that is critical to the public’s understanding of AI. In the end, the [E.U.’s] attempt to regulate open-source could create a convoluted set of requirements that endangers open-source AI contributors, likely without improving use of general-purpose AI."
"Higher education is already suffering from low graduation rates, high student debt, and stagnant inequality for racial minorities—crises that enrollment algorithms may be making worse."
"There is real debate over whether affective computing can do any of the tasks that it is trying to do."
"While there is this emerging understanding of how to regulate AI in some places—like hiring, employee surveillance, and college tuition and acceptance—we need a different way of thinking through what to do in the social media space."
"[America’s Bureau of Labour Statistics] has been grappling with the challenge of a transition to other approaches.”
As we move toward this new algorithmic approach, can we intervene to enable a fairer and more effective set of algorithms to ‘win’?
“If it’s harder to make a whole bunch of money as a snake oil company, there’ll be fewer of them. The more responsible companies will pick up that market share, and everybody wins.”
“There are parts that machine learning can probably help with, but fully automated interviews, where you’re making inferences about job performance—that’s terrible. Modern artificial intelligence can’t make those inferences.”
On the lack of transparency around algorithms: For companies, it’s often “an intentional move to centralize power, save them money and … deflect accountability. And as a customer, you’re losing influence, because it’s harder to interrogate the process.”
“The fact that there are technologists on the transition teams means that the whole process of rebuilding the government includes technology from the start.”