Find out how to describe the things around you in another language!
An iOS app using Core ML and the Vision framework in iOS 11, together with the Google Translate API. The app also reads the recognized words aloud.
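For context on the read-aloud part, here is a minimal sketch using `AVSpeechSynthesizer` from AVFoundation. The `speak` helper and its parameters are illustrative only, not the project's actual code.

```swift
import AVFoundation

// Minimal sketch, not the project's actual implementation.
// Keep the synthesizer alive (e.g. as a property) so speech isn't cut off
// when it goes out of scope.
let synthesizer = AVSpeechSynthesizer()

func speak(_ text: String, languageCode: String) {
    let utterance = AVSpeechUtterance(string: text)
    // AVSpeechSynthesisVoice expects a BCP-47 code such as "fr-FR".
    utterance.voice = AVSpeechSynthesisVoice(language: languageCode)
    synthesizer.speak(utterance)
}

// Example: pronounce the French translation of "chair".
speak("chaise", languageCode: "fr-FR")
```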
- Get a Google Translate API key
- Copy `Keys.example.xcconfig` to `Keys.xcconfig` and replace `YOUR_API_KEY_HERE` in the file with your API key from the first step
- Clean the build folder / clear derived data (⌥⇧⌘K)
- Run the app
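How the key is consumed at runtime depends on how the project wires up the xcconfig; one common pattern, sketched below under that assumption, is to expose the value through `Info.plist` and read it via `Bundle`. The `GoogleTranslateAPIKey` key name here is hypothetical.

```swift
import Foundation

// Sketch only: assumes the xcconfig value is referenced from Info.plist
// under a "GoogleTranslateAPIKey" entry. The key name is an assumption,
// not necessarily how this project exposes it.
func googleTranslateAPIKey() -> String? {
    Bundle.main.object(forInfoDictionaryKey: "GoogleTranslateAPIKey") as? String
}
```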
The app currently comes with SqueezeNet. You can replace it with another model available on Apple's Core ML page (e.g. ResNet50, Inception v3, VGG16) by dragging the `.mlmodel` file (e.g. `VGG16.mlmodel`) into Xcode where you see `SqueezeNet.mlmodel`, and then:
In `VisionService.swift`, replace the line:

```swift
model = try? VNCoreMLModel(for: SqueezeNet().model)
```

with

```swift
model = try? VNCoreMLModel(for: VGG16().model)
```

or the model of your choice.
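For reference, the sketch below shows how a Core ML model is typically wired into a Vision classification request. It illustrates the general pattern only and is not the project's `VisionService` implementation.

```swift
import CoreML
import Vision

// Sketch of the usual Core ML + Vision classification flow.
// SqueezeNet is the class Xcode generates from SqueezeNet.mlmodel;
// swap in VGG16() (or another model) after adding its .mlmodel file.
func classify(_ image: CGImage) {
    guard let model = try? VNCoreMLModel(for: SqueezeNet().model) else { return }

    let request = VNCoreMLRequest(model: model) { request, _ in
        // Take the top classification; in this app, that label would then
        // be sent to Google Translate.
        guard let best = (request.results as? [VNClassificationObservation])?.first else { return }
        print(best.identifier, best.confidence)
    }

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try? handler.perform([request])
}
```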
These are the languages that can both be translated by Google Translate and pronounced by iOS (listed in `Language.swift`); a sketch of how such an entry could be modeled follows the list.
- Arabic
- Chinese (Simplified)
- Chinese (Traditional)
- Czech
- Danish
- Dutch
- English
- Finnish
- French
- German
- Greek
- Hebrew
- Hindi
- Hungarian
- Indonesian
- Italian
- Japanese
- Korean
- Norwegian
- Polish
- Portuguese
- Romanian
- Russian
- Slovak
- Spanish
- Swedish
- Thai
- Turkish
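Below is a hypothetical sketch of how one of these entries could be modeled; the actual `Language.swift` in the project may differ.

```swift
// Hypothetical model of a supported language: a Google Translate target
// code paired with a BCP-47 voice identifier for speech synthesis.
struct Language {
    let name: String            // e.g. "French"
    let translationCode: String // Google Translate code, e.g. "fr"
    let voiceCode: String       // AVSpeechSynthesisVoice code, e.g. "fr-FR"
}

let french = Language(name: "French", translationCode: "fr", voiceCode: "fr-FR")
```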