Uses of Enum ubic.basecode.ontology.providers.OntologyService.InferenceMode
Packages that use OntologyService.InferenceMode

ubic.basecode.ontology.jena
    Implementation of OntologyService using Apache Jena.

ubic.basecode.ontology.providers
    This package contains baseCode built-in ontologies and a GenericOntologyService to implement your own ontologies.
Uses of OntologyService.InferenceMode in ubic.basecode.ontology.jena

Methods in ubic.basecode.ontology.jena with parameters of type OntologyService.InferenceMode:

protected com.hp.hpl.jena.ontology.OntModelSpec
    AbstractOntologyService.getSpec(OntologyService.LanguageLevel languageLevel, OntologyService.InferenceMode inferenceMode)

protected abstract OntologyModel
    AbstractOntologyService.loadModel(boolean processImports, OntologyService.LanguageLevel languageLevel, OntologyService.InferenceMode inferenceMode)
    Delegates the decision of whether to load the model into memory or leave it on disk.

protected OntologyModel
    ClasspathOntologyService.loadModel(boolean processImports, OntologyService.LanguageLevel languageLevel, OntologyService.InferenceMode inferenceMode)

protected OntologyModel
    TdbOntologyService.loadModel(boolean processImports, OntologyService.LanguageLevel languageLevel, OntologyService.InferenceMode inferenceMode)

protected OntologyModel
    UrlOntologyService.loadModel(boolean processImports, OntologyService.LanguageLevel languageLevel, OntologyService.InferenceMode inferenceMode)

protected abstract OntologyModel
    AbstractOntologyService.loadModelFromStream(InputStream is, boolean processImports, OntologyService.LanguageLevel languageLevel, OntologyService.InferenceMode inferenceMode)
    Load a model from a given input stream.

protected OntologyModel
    ClasspathOntologyService.loadModelFromStream(InputStream is, boolean processImports, OntologyService.LanguageLevel languageLevel, OntologyService.InferenceMode inferenceMode)

protected OntologyModel
    TdbOntologyService.loadModelFromStream(InputStream is, boolean processImports, OntologyService.LanguageLevel languageLevel, OntologyService.InferenceMode inferenceMode)

protected OntologyModel
    UrlOntologyService.loadModelFromStream(InputStream is, boolean processImports, OntologyService.LanguageLevel languageLevel, OntologyService.InferenceMode inferenceMode)

void
    AbstractOntologyService.setInferenceMode(OntologyService.InferenceMode inferenceMode)
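The listing above shows a template-method layout: the abstract base class stores the configured inference mode and delegates the actual loading strategy to subclass overrides of loadModel. The following is a minimal, self-contained sketch of that pattern only; every name in it (Mode, Model, AbstractService, MemoryService) is a simplified stand-in for illustration, not the real baseCode API.

```java
// Sketch of the delegation pattern: the base class owns the inference-mode
// setting, subclasses decide *how* a model is actually loaded.
public class LoadModelSketch {

    // Hypothetical constants; this page does not list the real enum's values.
    enum Mode { NONE, TRANSITIVE, FULL }

    static class Model {
        final String description;
        Model(String description) { this.description = description; }
    }

    static abstract class AbstractService {
        private Mode inferenceMode = Mode.NONE;

        void setInferenceMode(Mode mode) { this.inferenceMode = mode; }
        Mode getInferenceMode() { return inferenceMode; }

        // Template method: the base class decides *when* to load and passes
        // the configured mode down; the subclass supplies the strategy.
        Model initialize(boolean processImports) {
            return loadModel(processImports, inferenceMode);
        }

        protected abstract Model loadModel(boolean processImports, Mode mode);
    }

    // One concrete strategy: keep everything in memory.
    static class MemoryService extends AbstractService {
        @Override
        protected Model loadModel(boolean processImports, Mode mode) {
            return new Model("in-memory, imports=" + processImports + ", mode=" + mode);
        }
    }

    public static void main(String[] args) {
        AbstractService s = new MemoryService();
        s.setInferenceMode(Mode.TRANSITIVE);
        System.out.println(s.initialize(true).description);
    }
}
```

In the real library the same shape appears with several strategies (classpath, TDB on-disk store, URL), which is why loadModel and loadModelFromStream are overridden once per subclass.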
Uses of OntologyService.InferenceMode in ubic.basecode.ontology.providers

Methods in ubic.basecode.ontology.providers that return OntologyService.InferenceMode:

OntologyService.InferenceMode
    AbstractDelegatingOntologyService.getInferenceMode()

OntologyService.InferenceMode
    OntologyService.getInferenceMode()
    Obtain the inference mode used for this ontology.

static OntologyService.InferenceMode
    OntologyService.InferenceMode.valueOf(String name)
    Returns the enum constant of this type with the specified name.

static OntologyService.InferenceMode[]
    OntologyService.InferenceMode.values()
    Returns an array containing the constants of this enum type, in the order they are declared.

Methods in ubic.basecode.ontology.providers with parameters of type OntologyService.InferenceMode:

void
    AbstractDelegatingOntologyService.setInferenceMode(OntologyService.InferenceMode inferenceMode)

void
    OntologyService.setInferenceMode(OntologyService.InferenceMode inferenceMode)
    Set the inference mode used for this ontology.
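The values() and valueOf(String) entries above are the members the Java compiler generates for every enum, so their contract is fixed by the language rather than by baseCode. A self-contained demonstration of that contract, using a stand-in enum because this page does not list InferenceMode's actual constants:

```java
import java.util.Arrays;

public class EnumContractDemo {
    // Stand-in enum; NONE/TRANSITIVE/FULL are illustrative constants only.
    enum Mode { NONE, TRANSITIVE, FULL }

    public static void main(String[] args) {
        // values() returns the constants in declaration order.
        System.out.println(Arrays.toString(Mode.values())); // [NONE, TRANSITIVE, FULL]

        // valueOf() does an exact, case-sensitive name lookup and throws
        // IllegalArgumentException for an unknown name.
        Mode m = Mode.valueOf("TRANSITIVE");
        System.out.println(m == Mode.TRANSITIVE); // true
    }
}
```

Because valueOf is case-sensitive, callers parsing a mode from configuration text typically normalize the input (e.g. with toUpperCase) before the lookup.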