Your future can take off with the explosive growth of AI Inference and eFPGA
We are hiring for AI Inference & eFPGA
InferX X1: World's fastest and most efficient Edge Inference Accelerator
We have just launched our first inference chip, and it is the best in the world for edge inference. We are bringing up neural network models now and working through the steps required for Q2 2021 chip and board production and Inference Compiler availability. We need more people in multiple functions to handle customer demand!
Join our growing team!
You’ll learn faster, have a bigger impact, and have more fun at Flex Logix. Work with highly talented software developers and hardware engineers who have shipped numerous software projects and high-volume ICs on process nodes from 7nm to 180nm. Our products are multidisciplinary: our hardware and software are co-designed and must work well together.
We are leaders in embedded FPGA hardware and software: just as ARM enables processors to be integrated into ICs, we enable FPGAs to be integrated, making them smaller, faster, lower power, and lower cost.
And our new AI Inference hardware and software offer superior throughput per dollar and per watt for Edge Inference applications. Our first chip and board are working, and we are moving into production.
We have grown fast and need more great people to keep up with growing customer demand. We look for top performers with a proven track record who are passionate and entrepreneurial.
We don’t care where you are from (our current team comes from North America, South America, Africa, Europe, and Asia), but you must currently live in Silicon Valley. We prefer US citizens or US permanent residents (green card holders), but we will also consider H-1B holders. Our team includes multiple women, including in management.
How to Apply
Send your resume and contact info to email@example.com.
Only apply if you are highly qualified, very smart, super-motivated, willing to work hard, and ready to join a team that will challenge and motivate you.