Merge pull request #3 from qpdb/gburd/learning-by-linting

lint
Gregory Burd 2020-01-31 13:59:38 -05:00 committed by GitHub
commit 18a0c15320
No known key found for this signature in database
GPG key ID: 4AEE18F83AFDEB23
52 changed files with 762 additions and 621 deletions

.gitignore (vendored), 1 change

@@ -92,3 +92,4 @@ build.xcarchive
 docs/_site
 docs/.sass-cache
 docs/.jekyll-metadata


@@ -1,27 +1,37 @@
 language: rust
+cache: cargo # cache cargo-audit once installed
+before_script:
+# - cargo install --force clippy
+  - cargo install --force cargo-audit
+  - cargo generate-lockfile
+script:
+  - cargo audit
 # We use OSX so that we can get a reasonably up to date version of SQLCipher.
 # (The version in Travis's default Ubuntu Trusty is much too old).
 os: osx
 before_install:
-  - brew install sqlcipher --with-fts
+  - brew install sqlcipher
 rust:
-  - 1.25.0
+  - 1.41.0
   - stable
   - beta
   - nightly
 matrix:
   allow_failures:
+    - rust: stable
     - rust: nightly
   fast_finish: true
 jobs:
   include:
     - stage: "Test iOS"
-      rust: 1.25.0
+      rust: 1.41.0
       script: ./scripts/test-ios.sh
     - stage: "Docs"
-      rust: 1.25.0
+      rust: 1.41.0
       script: ./scripts/cargo-doc.sh
 script:
+  - cargo build --verbose --all
+# - cargo clippy --all-targets --all-features -- -D warnings # Check tests and non-default crate features.
   - cargo test --verbose --all
   - cargo test --features edn/serde_support --verbose --all
 # We can't pick individual features out with `cargo test --all` (At the time of this writing, this


@@ -28,6 +28,13 @@ members = ["tools/cli", "ffi"]
 [build-dependencies]
 rustc_version = "0.2"
+[dev-dependencies.cargo-husky]
+version = "1"
+default-features = false # Disable features which are enabled by default
+features = ["run-for-all", "precommit-hook", "run-cargo-fmt", "run-cargo-test", "run-cargo-check", "run-cargo-clippy"]
+# cargo audit
+# cargo outdated
 [dependencies]
 chrono = "0.4"
 failure = "0.1.6"


@@ -1,17 +1,13 @@
 # Project Mentat
-[![Build Status](https://travis-ci.org/mozilla/mentat.svg?branch=master)](https://travis-ci.org/mozilla/mentat)
+[![Build Status](https://travis-ci.org/qpdb/mentat.svg?branch=master)](https://travis-ci.org/qpdb/mentat)
-**Project Mentat is [no longer being developed or actively maintained by Mozilla](https://mail.mozilla.org/pipermail/firefox-dev/2018-September/006780.html).** This repository will be marked read-only in the near future. You are, of course, welcome to fork the repository and use the existing code.
 Project Mentat is a persistent, embedded knowledge base. It draws heavily on [DataScript](https://github.com/tonsky/datascript) and [Datomic](http://datomic.com).
+Mentat is implemented in Rust. This project was started by Mozilla, but [is no longer being developed or actively maintained by them](https://mail.mozilla.org/pipermail/firefox-dev/2018-September/006780.html). [Their repository](https://github.com/mozilla/mentat) was marked read-only, [this fork](https://github.com/qpdb/mentat) is an attempt to revive and continue that interesting work. We owe the team at Mozilla more than words can express for inspiring us all and for this project in particular.
-The first version of Project Mentat, named Datomish, [was written in ClojureScript](https://github.com/mozilla/mentat/tree/clojure), targeting both Node (on top of `promise_sqlite`) and Firefox (on top of `Sqlite.jsm`). It also worked in pure Clojure on the JVM on top of `jdbc-sqlite`. The name was changed to avoid confusion with [Datomic](http://datomic.com).
+*Thank you*.
-The Rust implementation gives us a smaller compiled output, better performance, more type safety, better tooling, and easier deployment into Firefox and mobile platforms.
+[Documentation](https://docs.rs/mentat)
-[Documentation](https://mozilla.github.io/mentat)
 ---
@@ -77,9 +73,11 @@ We've observed that data storage is a particular area of difficulty for software
 DataScript asks the question: "What if creating a database were as cheap as creating a Hashmap?"
-Mentat is not interested in that. Instead, it's strongly interested in persistence and performance, with very little interest in immutable databases/databases as values or throwaway use.
+Mentat is not interested in that. Instead, it's focused on persistence and performance, with very little interest in immutable databases/databases as values or throwaway use.
-One might say that Mentat's question is: "What if an SQLite database could store arbitrary relations, for arbitrary consumers, without them having to coordinate an up-front storage-level schema?"
+One might say that Mentat's question is: "What if a database could store arbitrary relations, for arbitrary consumers, without them having to coordinate an up-front storage-level schema?"
+Consider this a practical approach to facts, to knowledge its storage and access, much like SQLite is a practical RDBMS.
 (Note that [domain-level schemas are very valuable](http://martinfowler.com/articles/schemaless/).)
@@ -89,7 +87,7 @@ Some thought has been given to how databases as values — long-term references
 Just like DataScript, Mentat speaks Datalog for querying and takes additions and retractions as input to a transaction.
-Unlike DataScript, Mentat exposes free-text indexing, thanks to SQLite.
+Unlike DataScript, Mentat exposes free-text indexing, thanks to SQLite/FTS.
 ## Comparison to Datomic
@@ -98,8 +96,6 @@ Datomic is a server-side, enterprise-grade data storage system. Datomic has a be
 Many of these design decisions are inapplicable to deployed desktop software; indeed, the use of multiple JVM processes makes Datomic's use in a small desktop app, or a mobile device, prohibitive.
-Mentat was designed for embedding, initially in an experimental Electron app ([Tofino](https://github.com/mozilla/tofino)). It is less concerned with exposing consistent database states outside transaction boundaries, because that's less important here, and dropping some of these requirements allows us to leverage SQLite itself.
 ## Comparison to SQLite


@@ -14,7 +14,7 @@ use std::process::exit;
 /// MIN_VERSION should be changed when there's a new minimum version of rustc required
 /// to build the project.
-static MIN_VERSION: &'static str = "1.41.0";
+static MIN_VERSION: &str = "1.40.0";
 fn main() {
     let ver = version().unwrap();


@@ -102,7 +102,7 @@ impl<V: TransactableValueMarker> Into<ValuePlace<V>> for KnownEntid {
 /// When moving to a more concrete table, such as `datoms`, they are expanded out
 /// via these flags and put into their own column rather than a bit field.
 pub enum AttributeBitFlags {
-    IndexAVET = 1 << 0,
+    IndexAVET = 1,
     IndexVAET = 1 << 1,
     IndexFulltext = 1 << 2,
     UniqueValue = 1 << 3,
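The `1 << 0` to `1` change silences clippy's `identity_op` lint: shifting by zero is a no-op, so the plain literal is clearer while the flag value is unchanged. A minimal sketch of how such bit flags combine (names are illustrative, not Mentat's actual code):

```rust
// Illustrative bit-flag enum in the style of AttributeBitFlags.
#[allow(dead_code)]
#[derive(Clone, Copy)]
enum Flags {
    IndexAVET = 1,          // `1 << 0` and `1` are identical; clippy prefers the literal
    IndexVAET = 1 << 1,     // 2
    IndexFulltext = 1 << 2, // 4
    UniqueValue = 1 << 3,   // 8
}

// Combine flags into the single bit-field column described in the comment above.
fn attribute_flags(index: bool, fulltext: bool) -> u8 {
    let mut f = 0u8;
    if index {
        f |= Flags::IndexAVET as u8;
    }
    if fulltext {
        f |= Flags::IndexFulltext as u8;
    }
    f
}

fn main() {
    assert_eq!(attribute_flags(true, true), 0b0101);
    assert_eq!(attribute_flags(false, false), 0);
}
```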
@@ -327,20 +327,20 @@ impl ValueType {
     pub fn from_keyword(keyword: &Keyword) -> Option<Self> {
         if keyword.namespace() != Some("db.type") {
-            return None;
+            None
+        } else {
+            match keyword.name() {
+                "ref" => Some(ValueType::Ref),
+                "boolean" => Some(ValueType::Boolean),
+                "instant" => Some(ValueType::Instant),
+                "long" => Some(ValueType::Long),
+                "double" => Some(ValueType::Double),
+                "string" => Some(ValueType::String),
+                "keyword" => Some(ValueType::Keyword),
+                "uuid" => Some(ValueType::Uuid),
+                _ => None,
+            }
         }
-        return match keyword.name() {
-            "ref" => Some(ValueType::Ref),
-            "boolean" => Some(ValueType::Boolean),
-            "instant" => Some(ValueType::Instant),
-            "long" => Some(ValueType::Long),
-            "double" => Some(ValueType::Double),
-            "string" => Some(ValueType::String),
-            "keyword" => Some(ValueType::Keyword),
-            "uuid" => Some(ValueType::Uuid),
-            _ => None,
-        };
     }
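This rewrite addresses clippy's `needless_return`: in Rust, the whole `if`/`else` is an expression whose value the function yields, so no `return` keyword is needed. A reduced, hypothetical version of the same shape (stand-in names, not Mentat's code):

```rust
// The if/else is the function's value; no `return` statements required.
fn type_for_name(namespace: &str, name: &str) -> Option<&'static str> {
    if namespace != "db.type" {
        None
    } else {
        match name {
            "ref" => Some("Ref"),
            "long" => Some("Long"),
            _ => None,
        }
    }
}

fn main() {
    assert_eq!(type_for_name("db.type", "long"), Some("Long"));
    assert_eq!(type_for_name("other", "long"), None);
}
```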
     pub fn into_typed_value(self) -> TypedValue {
@@ -372,9 +372,9 @@ impl ValueType {
         }
     }
-    pub fn is_numeric(&self) -> bool {
+    pub fn is_numeric(self) -> bool {
         match self {
-            &ValueType::Long | &ValueType::Double => true,
+            ValueType::Long | ValueType::Double => true,
             _ => false,
         }
     }
@@ -440,14 +440,14 @@ impl TypedValue {
     pub fn value_type(&self) -> ValueType {
         match self {
-            &TypedValue::Ref(_) => ValueType::Ref,
-            &TypedValue::Boolean(_) => ValueType::Boolean,
-            &TypedValue::Long(_) => ValueType::Long,
-            &TypedValue::Instant(_) => ValueType::Instant,
-            &TypedValue::Double(_) => ValueType::Double,
-            &TypedValue::String(_) => ValueType::String,
-            &TypedValue::Keyword(_) => ValueType::Keyword,
-            &TypedValue::Uuid(_) => ValueType::Uuid,
+            TypedValue::Ref(_) => ValueType::Ref,
+            TypedValue::Boolean(_) => ValueType::Boolean,
+            TypedValue::Long(_) => ValueType::Long,
+            TypedValue::Instant(_) => ValueType::Instant,
+            TypedValue::Double(_) => ValueType::Double,
+            TypedValue::String(_) => ValueType::String,
+            TypedValue::Keyword(_) => ValueType::Keyword,
+            TypedValue::Uuid(_) => ValueType::Uuid,
         }
     }
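The dropped `&` prefixes rely on Rust 2018's match ergonomics (default binding modes): when matching a reference like `&self`'s contents, plain variant patterns match through the reference automatically, and clippy flags the older `&Variant(...)` style. An illustrative enum mirroring the `value_type` pattern (not Mentat's actual type):

```rust
#[derive(Debug, PartialEq)]
enum Value {
    Long(i64),
    Text(String),
}

impl Value {
    // `match self` with a `&self` receiver: `Value::Long(_)` matches through
    // the reference, so `&Value::Long(_)` is no longer needed.
    fn type_name(&self) -> &'static str {
        match self {
            Value::Long(_) => "long",
            Value::Text(_) => "text",
        }
    }
}

fn main() {
    assert_eq!(Value::Long(1).type_name(), "long");
    assert_eq!(Value::Text(String::from("x")).type_name(), "text");
}
```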
@@ -770,21 +770,21 @@ impl Binding {
     pub fn as_scalar(&self) -> Option<&TypedValue> {
         match self {
-            &Binding::Scalar(ref v) => Some(v),
+            Binding::Scalar(ref v) => Some(v),
             _ => None,
         }
     }
     pub fn as_vec(&self) -> Option<&Vec<Binding>> {
         match self {
-            &Binding::Vec(ref v) => Some(v),
+            Binding::Vec(ref v) => Some(v),
             _ => None,
         }
     }
     pub fn as_map(&self) -> Option<&StructuredMap> {
         match self {
-            &Binding::Map(ref v) => Some(v),
+            Binding::Map(ref v) => Some(v),
             _ => None,
         }
     }
@@ -856,10 +856,10 @@ impl Binding {
     pub fn value_type(&self) -> Option<ValueType> {
         match self {
-            &Binding::Scalar(ref v) => Some(v.value_type()),
-            &Binding::Map(_) => None,
-            &Binding::Vec(_) => None,
+            Binding::Scalar(ref v) => Some(v.value_type()),
+            Binding::Map(_) => None,
+            Binding::Vec(_) => None,
         }
     }
 }
@@ -970,56 +970,56 @@ impl Binding {
     pub fn as_entid(&self) -> Option<&Entid> {
         match self {
-            &Binding::Scalar(TypedValue::Ref(ref v)) => Some(v),
+            Binding::Scalar(TypedValue::Ref(ref v)) => Some(v),
             _ => None,
         }
     }
     pub fn as_kw(&self) -> Option<&ValueRc<Keyword>> {
         match self {
-            &Binding::Scalar(TypedValue::Keyword(ref v)) => Some(v),
+            Binding::Scalar(TypedValue::Keyword(ref v)) => Some(v),
             _ => None,
         }
     }
     pub fn as_boolean(&self) -> Option<&bool> {
         match self {
-            &Binding::Scalar(TypedValue::Boolean(ref v)) => Some(v),
+            Binding::Scalar(TypedValue::Boolean(ref v)) => Some(v),
             _ => None,
         }
     }
     pub fn as_long(&self) -> Option<&i64> {
         match self {
-            &Binding::Scalar(TypedValue::Long(ref v)) => Some(v),
+            Binding::Scalar(TypedValue::Long(ref v)) => Some(v),
             _ => None,
         }
     }
     pub fn as_double(&self) -> Option<&f64> {
         match self {
-            &Binding::Scalar(TypedValue::Double(ref v)) => Some(&v.0),
+            Binding::Scalar(TypedValue::Double(ref v)) => Some(&v.0),
             _ => None,
         }
     }
     pub fn as_instant(&self) -> Option<&DateTime<Utc>> {
         match self {
-            &Binding::Scalar(TypedValue::Instant(ref v)) => Some(v),
+            Binding::Scalar(TypedValue::Instant(ref v)) => Some(v),
             _ => None,
         }
     }
     pub fn as_string(&self) -> Option<&ValueRc<String>> {
         match self {
-            &Binding::Scalar(TypedValue::String(ref v)) => Some(v),
+            Binding::Scalar(TypedValue::String(ref v)) => Some(v),
             _ => None,
         }
     }
     pub fn as_uuid(&self) -> Option<&Uuid> {
         match self {
-            &Binding::Scalar(TypedValue::Uuid(ref v)) => Some(v),
+            Binding::Scalar(TypedValue::Uuid(ref v)) => Some(v),
             _ => None,
         }
     }


@@ -92,53 +92,53 @@ impl ValueTypeSet {
         self.0.insert(vt)
     }
-    pub fn len(&self) -> usize {
+    pub fn len(self) -> usize {
         self.0.len()
     }
     /// Returns a set containing all the types in this set and `other`.
-    pub fn union(&self, other: &ValueTypeSet) -> ValueTypeSet {
+    pub fn union(self, other: ValueTypeSet) -> ValueTypeSet {
         ValueTypeSet(self.0.union(other.0))
     }
-    pub fn intersection(&self, other: &ValueTypeSet) -> ValueTypeSet {
+    pub fn intersection(self, other: ValueTypeSet) -> ValueTypeSet {
         ValueTypeSet(self.0.intersection(other.0))
     }
     /// Returns the set difference between `self` and `other`, which is the
     /// set of items in `self` that are not in `other`.
-    pub fn difference(&self, other: &ValueTypeSet) -> ValueTypeSet {
+    pub fn difference(self, other: ValueTypeSet) -> ValueTypeSet {
         ValueTypeSet(self.0 - other.0)
     }
     /// Return an arbitrary type that's part of this set.
     /// For a set containing a single type, this will be that type.
-    pub fn exemplar(&self) -> Option<ValueType> {
+    pub fn exemplar(self) -> Option<ValueType> {
         self.0.iter().next()
     }
-    pub fn is_subset(&self, other: &ValueTypeSet) -> bool {
+    pub fn is_subset(self, other: ValueTypeSet) -> bool {
         self.0.is_subset(&other.0)
     }
     /// Returns true if `self` and `other` contain no items in common.
-    pub fn is_disjoint(&self, other: &ValueTypeSet) -> bool {
+    pub fn is_disjoint(self, other: ValueTypeSet) -> bool {
         self.0.is_disjoint(&other.0)
     }
-    pub fn contains(&self, vt: ValueType) -> bool {
+    pub fn contains(self, vt: ValueType) -> bool {
         self.0.contains(&vt)
     }
-    pub fn is_empty(&self) -> bool {
+    pub fn is_empty(self) -> bool {
         self.0.is_empty()
     }
-    pub fn is_unit(&self) -> bool {
+    pub fn is_unit(self) -> bool {
         self.0.len() == 1
     }
-    pub fn iter(&self) -> ::enum_set::Iter<ValueType> {
+    pub fn iter(self) -> ::enum_set::Iter<ValueType> {
         self.0.iter()
     }
 }
@@ -150,8 +150,8 @@ impl From<ValueType> for ValueTypeSet {
 }
 impl ValueTypeSet {
-    pub fn is_only_numeric(&self) -> bool {
-        self.is_subset(&ValueTypeSet::of_numeric_types())
+    pub fn is_only_numeric(self) -> bool {
+        self.is_subset(ValueTypeSet::of_numeric_types())
     }
 }
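These receiver changes follow clippy's `trivially_copy_pass_by_ref`: `ValueTypeSet` wraps a small `Copy` bit set, so passing it by value costs no more than passing a reference and avoids a needless indirection. A sketch of the same change on a hypothetical `Copy` bit-set standing in for the `EnumSet` wrapper (not Mentat's actual type):

```rust
#[derive(Clone, Copy, PartialEq, Debug)]
struct TypeSet(u8);

impl TypeSet {
    // Taking `self` by value: the set is a single machine word, so copying it
    // is at least as cheap as passing `&self`.
    fn union(self, other: TypeSet) -> TypeSet {
        TypeSet(self.0 | other.0)
    }
    fn is_subset(self, other: TypeSet) -> bool {
        self.0 & !other.0 == 0
    }
}

fn main() {
    let a = TypeSet(0b01);
    let b = TypeSet(0b10);
    assert_eq!(a.union(b), TypeSet(0b11));
    assert!(a.is_subset(a.union(b)));
    // `a` is still usable here because TypeSet is Copy.
    assert!(!TypeSet(0b100).is_subset(a));
}
```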


@@ -19,11 +19,11 @@ use edn::types::Value;
 /// Declare a lazy static `ident` of type `Value::Keyword` with the given `namespace` and
 /// `name`.
 ///
-/// It may look surprising that we declare a new `lazy_static!` block rather than including
+/// It may look surprising to declare a new `lazy_static!` block rather than including
 /// invocations inside an existing `lazy_static!` block. The latter cannot be done, since macros
-/// are expanded outside-in. Looking at the `lazy_static!` source suggests that there is no harm in
-/// repeating that macro, since internally a multi-`static` block is expanded into many
-/// single-`static` blocks.
+/// will be expanded outside-in. Looking at the `lazy_static!` source suggests that there is no
+/// harm in repeating that macro, since internally a multi-`static` block will be expanded into
+/// many single-`static` blocks.
 ///
 /// TODO: take just ":db.part/db" and define DB_PART_DB using "db.part" and "db".
 macro_rules! lazy_static_namespaced_keyword_value (


@@ -11,7 +11,7 @@
 use std::cell::Cell;
 use std::rc::Rc;
-#[derive(Clone)]
+#[derive(Clone, Default)]
 pub struct RcCounter {
     c: Rc<Cell<usize>>,
 }


@@ -135,7 +135,7 @@ impl Schema {
     }
     fn get_raw_entid(&self, x: &Keyword) -> Option<Entid> {
-        self.ident_map.get(x).map(|x| *x)
+        self.ident_map.get(x).copied()
     }
     pub fn update_component_attributes(&mut self) {
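The `.map(|x| *x)` to `.copied()` change is clippy's `map_clone`/`clone_on_copy` family: `BTreeMap::get` yields an `Option<&Entid>`, and `Option::copied` (stable since Rust 1.35) expresses the dereference directly. The same change in miniature, with hypothetical stand-in types:

```rust
use std::collections::BTreeMap;

// `get` returns Option<&i64>; `.copied()` replaces the manual `.map(|x| *x)`.
fn get_raw(ident_map: &BTreeMap<String, i64>, k: &str) -> Option<i64> {
    ident_map.get(k).copied()
}

fn main() {
    let mut ident_map = BTreeMap::new();
    ident_map.insert(String::from(":db/ident"), 1);
    assert_eq!(get_raw(&ident_map, ":db/ident"), Some(1));
    assert_eq!(get_raw(&ident_map, ":missing"), None);
}
```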


@@ -69,7 +69,7 @@ impl ::std::fmt::Display for SchemaConstraintViolation {
     fn fmt(&self, f: &mut ::std::fmt::Formatter) -> ::std::fmt::Result {
         use self::SchemaConstraintViolation::*;
         match self {
-            &ConflictingUpserts {
+            ConflictingUpserts {
                 ref conflicting_upserts,
             } => {
                 writeln!(f, "conflicting upserts:")?;
@@ -78,7 +78,7 @@ impl ::std::fmt::Display for SchemaConstraintViolation {
                 }
                 Ok(())
             }
-            &TypeDisagreements {
+            TypeDisagreements {
                 ref conflicting_datoms,
             } => {
                 writeln!(f, "type disagreements:")?;
@@ -91,9 +91,9 @@ impl ::std::fmt::Display for SchemaConstraintViolation {
                 }
                 Ok(())
             }
-            &CardinalityConflicts { ref conflicts } => {
+            CardinalityConflicts { ref conflicts } => {
                 writeln!(f, "cardinality conflicts:")?;
-                for ref conflict in conflicts {
+                for conflict in conflicts {
                     writeln!(f, "  {:?}", conflict)?;
                 }
                 Ok(())
@@ -116,10 +116,10 @@ impl ::std::fmt::Display for InputError {
     fn fmt(&self, f: &mut ::std::fmt::Formatter) -> ::std::fmt::Result {
         use self::InputError::*;
         match self {
-            &BadDbId => {
+            BadDbId => {
                 writeln!(f, ":db/id in map notation must either not be present or be an entid, an ident, or a tempid")
             },
-            &BadEntityPlace => {
+            BadEntityPlace => {
                 writeln!(f, "cannot convert value place into entity place")
             },
         }
@@ -163,7 +163,7 @@ impl From<DbErrorKind> for DbError {
 impl From<Context<DbErrorKind>> for DbError {
     fn from(inner: Context<DbErrorKind>) -> Self {
-        DbError { inner: inner }
+        DbError { inner }
     }
 }


@@ -48,12 +48,10 @@ where
         } else {
             self.asserted.insert(key, value);
         }
+    } else if let Some(asserted_value) = self.asserted.remove(&key) {
+        self.altered.insert(key, (value, asserted_value));
     } else {
-        if let Some(asserted_value) = self.asserted.remove(&key) {
-            self.altered.insert(key, (value, asserted_value));
-        } else {
-            self.retracted.insert(key, value);
-        }
+        self.retracted.insert(key, value);
     }
 }
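This is clippy's `collapsible_else_if`: an `else { if … }` block flattens to `else if …`, removing one nesting level with identical behavior. A hypothetical three-way bookkeeping sketch of the same shape (stand-in names, not Mentat's code):

```rust
use std::collections::BTreeMap;

// `else if let` replaces the nested `else { if let … }` block.
fn classify(
    asserted: &mut BTreeMap<i32, &'static str>,
    retracted: &mut BTreeMap<i32, &'static str>,
    altered: &mut BTreeMap<i32, (&'static str, &'static str)>,
    key: i32,
    value: &'static str,
) {
    if let Some(asserted_value) = asserted.remove(&key) {
        // A retraction of a previously asserted value is an alteration.
        altered.insert(key, (value, asserted_value));
    } else {
        retracted.insert(key, value);
    }
}

fn main() {
    let mut asserted = BTreeMap::new();
    let mut retracted = BTreeMap::new();
    let mut altered = BTreeMap::new();
    asserted.insert(1, "a");
    classify(&mut asserted, &mut retracted, &mut altered, 1, "b");
    classify(&mut asserted, &mut retracted, &mut altered, 2, "c");
    assert_eq!(altered.get(&1), Some(&("b", "a")));
    assert_eq!(retracted.get(&2), Some(&"c"));
}
```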


@@ -27,7 +27,7 @@ use types::{Partition, PartitionMap};
 /// The first transaction ID applied to the knowledge base.
 ///
 /// This is the start of the :db.part/tx partition.
-pub const TX0: i64 = 0x10000000;
+pub const TX0: i64 = 0x1000_0000;
 /// This is the start of the :db.part/user partition.
 pub const USER0: i64 = 0x10000;
@@ -206,14 +206,14 @@ lazy_static! {
 /// Convert (ident, entid) pairs into [:db/add IDENT :db/ident IDENT] `Value` instances.
 fn idents_to_assertions(idents: &[(symbols::Keyword, i64)]) -> Vec<Value> {
     idents
-        .into_iter()
+        .iter()
         .map(|&(ref ident, _)| {
             let value = Value::Keyword(ident.clone());
             Value::Vector(vec![
                 values::DB_ADD.clone(),
                 value.clone(),
                 values::DB_IDENT.clone(),
-                value.clone(),
+                value,
             ])
         })
         .collect()
@@ -225,7 +225,7 @@ fn schema_attrs_to_assertions(version: u32, idents: &[symbols::Keyword]) -> Vec<
     let schema_attr = Value::Keyword(ns_keyword!("db.schema", "attribute"));
     let schema_version = Value::Keyword(ns_keyword!("db.schema", "version"));
     idents
-        .into_iter()
+        .iter()
         .map(|ident| {
             let value = Value::Keyword(ident.clone());
             Value::Vector(vec![
@@ -260,7 +260,7 @@ fn symbolic_schema_to_triples(
         Value::Map(ref m) => {
             for (ident, mp) in m {
                 let ident = match ident {
-                    &Value::Keyword(ref ident) => ident,
+                    Value::Keyword(ref ident) => ident,
                     _ => bail!(DbErrorKind::BadBootstrapDefinition(format!(
                         "Expected namespaced keyword for ident but got '{:?}'",
                         ident
@@ -270,7 +270,7 @@ fn symbolic_schema_to_triples(
                 Value::Map(ref mpp) => {
                     for (attr, value) in mpp {
                         let attr = match attr {
-                            &Value::Keyword(ref attr) => attr,
+                            Value::Keyword(ref attr) => attr,
                             _ => bail!(DbErrorKind::BadBootstrapDefinition(format!(
                                 "Expected namespaced keyword for attr but got '{:?}'",
                                 attr
@@ -289,7 +289,7 @@ fn symbolic_schema_to_triples(
                             Some(TypedValue::Keyword(ref k)) => ident_map
                                 .get(k)
                                 .map(|entid| TypedValue::Ref(*entid))
-                                .ok_or(DbErrorKind::UnrecognizedIdent(k.to_string()))?,
+                                .ok_or_else(|| DbErrorKind::UnrecognizedIdent(k.to_string()))?,
                             Some(v) => v,
                             _ => bail!(DbErrorKind::BadBootstrapDefinition(format!(
                                 "Expected Mentat typed value for value but got '{:?}'",
@@ -377,8 +377,6 @@ pub(crate) fn bootstrap_entities() -> Vec<Entity<edn::ValueAndSpan>> {
     );
     // Failure here is a coding error (since the inputs are fixed), not a runtime error.
-    // TODO: represent these bootstrap data errors rather than just panicing.
-    let bootstrap_entities: Vec<Entity<edn::ValueAndSpan>> =
-        edn::parse::entities(&bootstrap_assertions.to_string()).expect("bootstrap assertions");
-    return bootstrap_entities;
+    // TODO: represent these bootstrap entity data errors rather than just panicing.
+    edn::parse::entities(&bootstrap_assertions.to_string()).expect("bootstrap assertions")
 }
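The `.ok_or(...)` to `.ok_or_else(|| ...)` change is clippy's `or_fun_call` lint: the closure defers building the error value until it is actually needed, where `ok_or` would construct it eagerly on every call, even on the success path. A miniature version with hypothetical stand-in types:

```rust
use std::collections::BTreeMap;

fn lookup(ident_map: &BTreeMap<String, i64>, k: &str) -> Result<i64, String> {
    ident_map
        .get(k)
        .copied()
        // With `.ok_or(format!(…))` the error String would be allocated
        // unconditionally; the closure runs only when the lookup misses.
        .ok_or_else(|| format!("unrecognized ident: {}", k))
}

fn main() {
    let mut m = BTreeMap::new();
    m.insert(String::from(":db/ident"), 1);
    assert_eq!(lookup(&m, ":db/ident"), Ok(1));
    assert!(lookup(&m, ":nope").is_err());
}
```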


@@ -54,8 +54,6 @@ use std::collections::btree_map::Entry::{Occupied, Vacant};
 use std::iter::once;
-use std::mem;
 use std::sync::Arc;
 use std::iter::Peekable;
@@ -190,7 +188,7 @@ impl AevFactory {
                     return existing;
                 }
                 self.strings.insert(rc.clone());
-                return TypedValue::String(rc);
+                TypedValue::String(rc)
             }
             t => t,
         }
@@ -377,7 +375,7 @@ impl RemoveFromCache for MultiValAttributeCache {
 impl CardinalityManyCache for MultiValAttributeCache {
     fn acc(&mut self, e: Entid, v: TypedValue) {
-        self.e_vs.entry(e).or_insert(vec![]).push(v)
+        self.e_vs.entry(e).or_insert_with(|| vec![]).push(v)
     }
     fn set(&mut self, e: Entid, vs: Vec<TypedValue>) {
@@ -439,7 +437,7 @@ impl UniqueReverseAttributeCache {
     }
     fn get_e(&self, v: &TypedValue) -> Option<Entid> {
-        self.v_e.get(v).and_then(|o| o.clone())
+        self.v_e.get(v).and_then(|o| *o)
     }
     fn lookup(&self, v: &TypedValue) -> Option<Option<Entid>> {
@@ -494,7 +492,7 @@ impl RemoveFromCache for NonUniqueReverseAttributeCache {
 impl NonUniqueReverseAttributeCache {
     fn acc(&mut self, e: Entid, v: TypedValue) {
-        self.v_es.entry(v).or_insert(BTreeSet::new()).insert(e);
+        self.v_es.entry(v).or_insert_with(BTreeSet::new).insert(e);
     }
     fn get_es(&self, v: &TypedValue) -> Option<&BTreeSet<Entid>> {
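The `or_insert` to `or_insert_with` changes apply clippy's `or_insert_with` guidance on the entry API: `or_insert(BTreeSet::new())` constructs the empty set before the vacancy check, while the function form builds it only when the key is actually absent. A small sketch with hypothetical stand-in names:

```rust
use std::collections::{BTreeMap, BTreeSet};

// Accumulate an entid under a value key, creating the set lazily.
fn acc(v_es: &mut BTreeMap<String, BTreeSet<i64>>, v: &str, e: i64) {
    // `BTreeSet::new` is passed as a function; it runs only on a vacant entry.
    v_es.entry(v.to_string()).or_insert_with(BTreeSet::new).insert(e);
}

fn main() {
    let mut v_es = BTreeMap::new();
    acc(&mut v_es, "v1", 100);
    acc(&mut v_es, "v1", 200);
    assert_eq!(v_es["v1"].len(), 2);
    assert_eq!(v_es.len(), 1);
}
```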
@ -643,9 +641,9 @@ enum AccumulationBehavior {
} }
impl AccumulationBehavior { impl AccumulationBehavior {
fn is_replacing(&self) -> bool { fn is_replacing(self) -> bool {
match self { match self {
&AccumulationBehavior::Add { replacing } => replacing, AccumulationBehavior::Add { replacing } => replacing,
_ => false, _ => false,
} }
} }
@ -662,7 +660,7 @@ pub struct AttributeCaches {
non_unique_reverse: BTreeMap<Entid, NonUniqueReverseAttributeCache>, non_unique_reverse: BTreeMap<Entid, NonUniqueReverseAttributeCache>,
} }
// TODO: if an entity or attribute is ever renumbered, the cache will need to be rebuilt. // TODO: if an entity or attribute is ever re-numbered, the cache will need to be rebuilt.
impl AttributeCaches { impl AttributeCaches {
// //
// These function names are brief and local. // These function names are brief and local.
@ -1006,7 +1004,7 @@ impl AttributeCaches {
} }
} }
// We need this block for fallback. // We need this block for fall-back.
impl AttributeCaches { impl AttributeCaches {
fn get_entid_for_value_if_present( fn get_entid_for_value_if_present(
&self, &self,
@ -1076,7 +1074,7 @@ impl AttributeCaches {
) -> Result<()> { ) -> Result<()> {
let mut aev_factory = AevFactory::new(); let mut aev_factory = AevFactory::new();
let rows = statement.query_map(&args, |row| Ok(aev_factory.row_to_aev(row)))?; let rows = statement.query_map(&args, |row| Ok(aev_factory.row_to_aev(row)))?;
let aevs = AevRows { rows: rows }; let aevs = AevRows { rows };
self.accumulate_into_cache( self.accumulate_into_cache(
None, None,
schema, schema,
@ -1132,7 +1130,7 @@ impl AttributeCaches {
schema: &'s Schema, schema: &'s Schema,
sqlite: &'c rusqlite::Connection, sqlite: &'c rusqlite::Connection,
attrs: AttributeSpec, attrs: AttributeSpec,
entities: &Vec<Entid>, entities: &[Entid],
) -> Result<()> { ) -> Result<()> {
// Mark the attributes as cached as we go. We do this because we're going in through the // Mark the attributes as cached as we go. We do this because we're going in through the
// back door here, and the usual caching API won't have taken care of this for us. // back door here, and the usual caching API won't have taken care of this for us.
@ -1229,17 +1227,17 @@ impl AttributeCaches {
schema: &'s Schema, schema: &'s Schema,
sqlite: &'c rusqlite::Connection, sqlite: &'c rusqlite::Connection,
mut attrs: AttributeSpec, mut attrs: AttributeSpec,
entities: &Vec<Entid>, entities: &[Entid],
) -> Result<()> { ) -> Result<()> {
// TODO: Exclude any entities for which every attribute is known. // TODO: Exclude any entities for which every attribute is known.
// TODO: initialize from an existing (complete) AttributeCache. // TODO: initialize from an existing (complete) AttributeCache.
// Exclude any attributes for which every entity's value is already known. // Exclude any attributes for which every entity's value is already known.
match &mut attrs { match &mut attrs {
&mut AttributeSpec::All => { AttributeSpec::All => {
// If we're caching all attributes, there's nothing we can exclude. // If we're caching all attributes, there's nothing we can exclude.
} }
&mut AttributeSpec::Specified { AttributeSpec::Specified {
ref mut non_fts, ref mut non_fts,
ref mut fts, ref mut fts,
} => { } => {
@ -1285,7 +1283,7 @@ impl AttributeCaches {
schema: &'s Schema, schema: &'s Schema,
sqlite: &'c rusqlite::Connection, sqlite: &'c rusqlite::Connection,
attrs: AttributeSpec, attrs: AttributeSpec,
entities: &Vec<Entid>, entities: &[Entid],
) -> Result<AttributeCaches> { ) -> Result<AttributeCaches> {
let mut cache = AttributeCaches::default(); let mut cache = AttributeCaches::default();
cache.populate_cache_for_entities_and_attributes(schema, sqlite, attrs, entities)?; cache.populate_cache_for_entities_and_attributes(schema, sqlite, attrs, entities)?;
@ -1450,7 +1448,7 @@ pub struct SQLiteAttributeCache {
} }
impl SQLiteAttributeCache { impl SQLiteAttributeCache {
fn make_mut<'s>(&'s mut self) -> &'s mut AttributeCaches { fn make_mut(&mut self) -> &mut AttributeCaches {
Arc::make_mut(&mut self.inner) Arc::make_mut(&mut self.inner)
} }
@ -1628,7 +1626,7 @@ impl InProgressSQLiteAttributeCache {
let overlay = inner.make_override(); let overlay = inner.make_override();
InProgressSQLiteAttributeCache { InProgressSQLiteAttributeCache {
inner: inner.inner, inner: inner.inner,
overlay: overlay, overlay,
unregistered_forward: Default::default(), unregistered_forward: Default::default(),
unregistered_reverse: Default::default(), unregistered_reverse: Default::default(),
} }
@ -1818,9 +1816,7 @@ impl CachedAttributes for InProgressSQLiteAttributeCache {
.inner .inner
.forward_cached_attributes .forward_cached_attributes
.iter() .iter()
.filter(|a| !self.unregistered_forward.contains(a)) .any(|a| !self.unregistered_forward.contains(a))
.next()
.is_some()
{ {
return true; return true;
} }
@ -1828,9 +1824,7 @@ impl CachedAttributes for InProgressSQLiteAttributeCache {
self.inner self.inner
.reverse_cached_attributes .reverse_cached_attributes
.iter() .iter()
.filter(|a| !self.unregistered_reverse.contains(a)) .any(|a| !self.unregistered_reverse.contains(a))
.next()
.is_some()
} }
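The two rewrites above replace `.filter(..).next().is_some()` with `.any(..)`, the standard fix for clippy's `filter_next`/`search_is_some` lints. A minimal sketch of the equivalence (integers stand in for the cached attribute entids):

```rust
fn main() {
    let cached = vec![1, 2, 3];
    let unregistered = vec![1];

    // Before the lint fix: build a lazy filter, pull one element, and
    // test it for Some.
    let verbose = cached
        .iter()
        .filter(|a| !unregistered.contains(*a))
        .next()
        .is_some();

    // After: `any` short-circuits on the first match and states the
    // intent directly (note the closure argument loses one `&` level).
    let idiomatic = cached.iter().any(|a| !unregistered.contains(a));

    assert_eq!(verbose, idiomatic);
    assert!(idiomatic); // 2 and 3 are not in `unregistered`.
}
```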
fn get_entids_for_value( fn get_entids_for_value(
@ -1944,7 +1938,7 @@ impl<'a> InProgressCacheTransactWatcher<'a> {
let mut w = InProgressCacheTransactWatcher { let mut w = InProgressCacheTransactWatcher {
collected_assertions: Default::default(), collected_assertions: Default::default(),
collected_retractions: Default::default(), collected_retractions: Default::default(),
cache: cache, cache,
active: true, active: true,
}; };
@ -1977,10 +1971,10 @@ impl<'a> TransactWatcher for InProgressCacheTransactWatcher<'a> {
} }
Entry::Occupied(mut entry) => { Entry::Occupied(mut entry) => {
match entry.get_mut() { match entry.get_mut() {
&mut Either::Left(_) => { Either::Left(_) => {
// Nothing to do. // Nothing to do.
} }
&mut Either::Right(ref mut vec) => { Either::Right(ref mut vec) => {
vec.push((e, v.clone())); vec.push((e, v.clone()));
} }
} }
@ -1989,14 +1983,12 @@ impl<'a> TransactWatcher for InProgressCacheTransactWatcher<'a> {
} }
fn done(&mut self, _t: &Entid, schema: &Schema) -> Result<()> { fn done(&mut self, _t: &Entid, schema: &Schema) -> Result<()> {
// Oh, I wish we had impl trait. Without it we have a six-line type signature if we // Oh, how I wish we had `impl trait`. Without it we have a six-line type signature if we
// try to break this out as a helper function. // try to break this out as a helper function.
let collected_retractions = let collected_retractions = std::mem::take(&mut self.collected_retractions);
mem::replace(&mut self.collected_retractions, Default::default()); let collected_assertions = std::mem::take(&mut self.collected_assertions);
let collected_assertions = mem::replace(&mut self.collected_assertions, Default::default());
let mut intermediate_expansion = once(collected_retractions) let mut intermediate_expansion = once(collected_retractions)
.chain(once(collected_assertions)) .chain(once(collected_assertions))
.into_iter()
.map(move |tree| { .map(move |tree| {
tree.into_iter() tree.into_iter()
.filter_map(move |(a, evs)| { .filter_map(move |(a, evs)| {
@ -2018,7 +2010,7 @@ impl<'a> TransactWatcher for InProgressCacheTransactWatcher<'a> {
} }
impl InProgressSQLiteAttributeCache { impl InProgressSQLiteAttributeCache {
pub fn transact_watcher<'a>(&'a mut self) -> InProgressCacheTransactWatcher<'a> { pub fn transact_watcher(&mut self) -> InProgressCacheTransactWatcher {
InProgressCacheTransactWatcher::new(self) InProgressCacheTransactWatcher::new(self)
} }
} }
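The `mem::replace(&mut x, Default::default())` → `std::mem::take(&mut x)` rewrite in `done` above is clippy's `mem_replace_with_default` fix: for any `T: Default`, `take` is the same swap spelled directly. A sketch with an illustrative map (not Mentat's actual collected-assertions type):

```rust
use std::collections::BTreeMap;
use std::mem;

fn main() {
    let mut collected: BTreeMap<i64, Vec<i64>> = BTreeMap::new();
    collected.insert(1, vec![10, 20]);

    // Before: spell out "swap in the default value" by hand.
    let drained = mem::replace(&mut collected, Default::default());
    assert_eq!(drained.len(), 1);
    assert!(collected.is_empty());

    // After: `mem::take` performs the identical operation.
    collected.insert(2, vec![30]);
    let drained = mem::take(&mut collected);
    assert_eq!(drained[&2], vec![30]);
    assert!(collected.is_empty());
}
```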


@ -66,10 +66,9 @@ fn make_connection(
let page_size = 32768; let page_size = 32768;
let initial_pragmas = if let Some(encryption_key) = maybe_encryption_key { let initial_pragmas = if let Some(encryption_key) = maybe_encryption_key {
assert!( if !cfg!(feature = "sqlcipher") {
cfg!(feature = "sqlcipher"), panic!("This function shouldn't be called with a key unless we have sqlcipher support");
"This function shouldn't be called with a key unless we have sqlcipher support" }
);
// Important: The `cipher_page_size` cannot be changed without breaking // Important: The `cipher_page_size` cannot be changed without breaking
// the ability to open databases that were written when using a // the ability to open databases that were written when using a
// different `cipher_page_size`. Additionally, it (AFAICT) must be a // different `cipher_page_size`. Additionally, it (AFAICT) must be a
@ -147,10 +146,10 @@ pub const CURRENT_VERSION: i32 = 1;
/// MIN_SQLITE_VERSION should be changed when there's a new minimum version of sqlite required /// MIN_SQLITE_VERSION should be changed when there's a new minimum version of sqlite required
/// for the project to work. /// for the project to work.
const MIN_SQLITE_VERSION: i32 = 3008000; const MIN_SQLITE_VERSION: i32 = 3_008_000;
const TRUE: &'static bool = &true; const TRUE: &bool = &true;
const FALSE: &'static bool = &false; const FALSE: &bool = &false;
/// Turn an owned bool into a static reference to a bool. /// Turn an owned bool into a static reference to a bool.
/// ///
@ -360,9 +359,10 @@ pub fn create_current_version(conn: &mut rusqlite::Connection) -> Result<DB> {
// TODO: validate metadata mutations that aren't schema related, like additional partitions. // TODO: validate metadata mutations that aren't schema related, like additional partitions.
if let Some(next_schema) = next_schema { if let Some(next_schema) = next_schema {
if next_schema != db.schema { if next_schema != db.schema {
bail!(DbErrorKind::NotYetImplemented(format!( bail!(DbErrorKind::NotYetImplemented(
"Initial bootstrap transaction did not produce expected bootstrap schema" "Initial bootstrap transaction did not produce expected bootstrap schema"
))); .to_string()
));
} }
} }
@ -396,7 +396,7 @@ pub trait TypedSQLValue {
value: rusqlite::types::Value, value: rusqlite::types::Value,
value_type_tag: i32, value_type_tag: i32,
) -> Result<TypedValue>; ) -> Result<TypedValue>;
fn to_sql_value_pair<'a>(&'a self) -> (ToSqlOutput<'a>, i32); fn to_sql_value_pair(&self) -> (ToSqlOutput, i32);
fn from_edn_value(value: &Value) -> Option<TypedValue>; fn from_edn_value(value: &Value) -> Option<TypedValue>;
fn to_edn_value_pair(&self) -> (Value, ValueType); fn to_edn_value_pair(&self) -> (Value, ValueType);
} }
@ -446,43 +446,43 @@ impl TypedSQLValue for TypedValue {
/// This function is deterministic. /// This function is deterministic.
fn from_edn_value(value: &Value) -> Option<TypedValue> { fn from_edn_value(value: &Value) -> Option<TypedValue> {
match value { match value {
&Value::Boolean(x) => Some(TypedValue::Boolean(x)), Value::Boolean(x) => Some(TypedValue::Boolean(*x)),
&Value::Instant(x) => Some(TypedValue::Instant(x)), Value::Instant(x) => Some(TypedValue::Instant(*x)),
&Value::Integer(x) => Some(TypedValue::Long(x)), Value::Integer(x) => Some(TypedValue::Long(*x)),
&Value::Uuid(x) => Some(TypedValue::Uuid(x)), Value::Uuid(x) => Some(TypedValue::Uuid(*x)),
&Value::Float(ref x) => Some(TypedValue::Double(x.clone())), Value::Float(ref x) => Some(TypedValue::Double(*x)),
&Value::Text(ref x) => Some(x.clone().into()), Value::Text(ref x) => Some(x.clone().into()),
&Value::Keyword(ref x) => Some(x.clone().into()), Value::Keyword(ref x) => Some(x.clone().into()),
_ => None, _ => None,
} }
} }
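The pattern changes above (`&Value::Boolean(x)` becoming `Value::Boolean(x)` with a `*x` in the arm body) rely on Rust 2018 match ergonomics: matching a non-reference pattern against a reference auto-borrows the bindings, so the `&`/`ref` sigils disappear and `Copy` values get an explicit deref instead. A sketch with a stand-in enum (`Val` here is illustrative, not Mentat's `TypedValue`):

```rust
enum Val {
    Boolean(bool),
    Integer(i64),
    Text(String),
}

// Old style: match on `&Val`, so every arm needs an explicit `&`, and
// `ref` to avoid moving out of the borrow.
fn describe_old(v: &Val) -> i64 {
    match v {
        &Val::Boolean(x) => x as i64,
        &Val::Integer(x) => x,
        &Val::Text(ref s) => s.len() as i64,
    }
}

// New style: match ergonomics bind `x` as `&bool`/`&i64`/`&String`
// automatically, so `Copy` values are dereferenced with `*x` where an
// owned value is needed.
fn describe_new(v: &Val) -> i64 {
    match v {
        Val::Boolean(x) => *x as i64,
        Val::Integer(x) => *x,
        Val::Text(s) => s.len() as i64,
    }
}

fn main() {
    for v in [Val::Boolean(true), Val::Integer(7), Val::Text("abc".into())] {
        assert_eq!(describe_old(&v), describe_new(&v));
    }
}
```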
/// Return the corresponding SQLite `value` and `value_type_tag` pair. /// Return the corresponding SQLite `value` and `value_type_tag` pair.
fn to_sql_value_pair<'a>(&'a self) -> (ToSqlOutput<'a>, i32) { fn to_sql_value_pair(&self) -> (ToSqlOutput, i32) {
match self { match self {
&TypedValue::Ref(x) => (x.into(), 0), TypedValue::Ref(x) => ((*x).into(), 0),
&TypedValue::Boolean(x) => ((if x { 1 } else { 0 }).into(), 1), TypedValue::Boolean(x) => ((if *x { 1 } else { 0 }).into(), 1),
&TypedValue::Instant(x) => (x.to_micros().into(), 4), TypedValue::Instant(x) => (x.to_micros().into(), 4),
// SQLite distinguishes integral from decimal types, allowing long and double to share a tag. // SQLite distinguishes integral from decimal types, allowing long and double to share a tag.
&TypedValue::Long(x) => (x.into(), 5), TypedValue::Long(x) => ((*x).into(), 5),
&TypedValue::Double(x) => (x.into_inner().into(), 5), TypedValue::Double(x) => (x.into_inner().into(), 5),
&TypedValue::String(ref x) => (x.as_str().into(), 10), TypedValue::String(ref x) => (x.as_str().into(), 10),
&TypedValue::Uuid(ref u) => (u.as_bytes().to_vec().into(), 11), TypedValue::Uuid(ref u) => (u.as_bytes().to_vec().into(), 11),
&TypedValue::Keyword(ref x) => (x.to_string().into(), 13), TypedValue::Keyword(ref x) => (x.to_string().into(), 13),
} }
} }
/// Return the corresponding EDN `value` and `value_type` pair. /// Return the corresponding EDN `value` and `value_type` pair.
fn to_edn_value_pair(&self) -> (Value, ValueType) { fn to_edn_value_pair(&self) -> (Value, ValueType) {
match self { match self {
&TypedValue::Ref(x) => (Value::Integer(x), ValueType::Ref), TypedValue::Ref(x) => (Value::Integer(*x), ValueType::Ref),
&TypedValue::Boolean(x) => (Value::Boolean(x), ValueType::Boolean), TypedValue::Boolean(x) => (Value::Boolean(*x), ValueType::Boolean),
&TypedValue::Instant(x) => (Value::Instant(x), ValueType::Instant), TypedValue::Instant(x) => (Value::Instant(*x), ValueType::Instant),
&TypedValue::Long(x) => (Value::Integer(x), ValueType::Long), TypedValue::Long(x) => (Value::Integer(*x), ValueType::Long),
&TypedValue::Double(x) => (Value::Float(x), ValueType::Double), TypedValue::Double(x) => (Value::Float(*x), ValueType::Double),
&TypedValue::String(ref x) => (Value::Text(x.as_ref().clone()), ValueType::String), TypedValue::String(ref x) => (Value::Text(x.as_ref().clone()), ValueType::String),
&TypedValue::Uuid(ref u) => (Value::Uuid(u.clone()), ValueType::Uuid), TypedValue::Uuid(ref u) => (Value::Uuid(*u), ValueType::Uuid),
&TypedValue::Keyword(ref x) => (Value::Keyword(x.as_ref().clone()), ValueType::Keyword), TypedValue::Keyword(ref x) => (Value::Keyword(x.as_ref().clone()), ValueType::Keyword),
} }
} }
} }
@ -510,7 +510,7 @@ pub fn read_partition_map(conn: &rusqlite::Connection) -> Result<PartitionMap> {
// First part of the union sprinkles 'allow_excision' into the 'parts' view. // First part of the union sprinkles 'allow_excision' into the 'parts' view.
// Second part of the union takes care of partitions which are known // Second part of the union takes care of partitions which are known
// but don't have any transactions. // but don't have any transactions.
let mut stmt: rusqlite::Statement = conn.prepare( conn.prepare(
" "
SELECT SELECT
known_parts.part, known_parts.part,
@ -536,16 +536,14 @@ pub fn read_partition_map(conn: &rusqlite::Connection) -> Result<PartitionMap> {
known_parts known_parts
WHERE WHERE
part NOT IN (SELECT part FROM parts)", part NOT IN (SELECT part FROM parts)",
)?; )?
let m = stmt .query_and_then(rusqlite::params![], |row| -> Result<(String, Partition)> {
.query_and_then(rusqlite::params![], |row| -> Result<(String, Partition)> { Ok((
Ok(( row.get(0)?,
row.get(0)?, Partition::new(row.get(1)?, row.get(2)?, row.get(3)?, row.get(4)?),
Partition::new(row.get(1)?, row.get(2)?, row.get(3)?, row.get(4)?), ))
)) })?
})? .collect()
.collect();
m
} }
/// Read the ident map materialized view from the given SQL store. /// Read the ident map materialized view from the given SQL store.
@ -767,7 +765,7 @@ impl MentatStoring for rusqlite::Connection {
// //
// TODO: `collect` into a HashSet so that any (a, v) is resolved at most once. // TODO: `collect` into a HashSet so that any (a, v) is resolved at most once.
let max_vars = self.limit(Limit::SQLITE_LIMIT_VARIABLE_NUMBER) as usize; let max_vars = self.limit(Limit::SQLITE_LIMIT_VARIABLE_NUMBER) as usize;
let chunks: itertools::IntoChunks<_> = avs.into_iter().enumerate().chunks(max_vars / 4); let chunks: itertools::IntoChunks<_> = avs.iter().enumerate().chunks(max_vars / 4);
// We'd like to `flat_map` here, but it's not obvious how to `flat_map` across `Result`. // We'd like to `flat_map` here, but it's not obvious how to `flat_map` across `Result`.
// Alternatively, this is a `fold`, and it might be wise to express it as such. // Alternatively, this is a `fold`, and it might be wise to express it as such.
@ -900,9 +898,8 @@ impl MentatStoring for rusqlite::Connection {
let bindings_per_statement = 6; let bindings_per_statement = 6;
let max_vars = self.limit(Limit::SQLITE_LIMIT_VARIABLE_NUMBER) as usize; let max_vars = self.limit(Limit::SQLITE_LIMIT_VARIABLE_NUMBER) as usize;
let chunks: itertools::IntoChunks<_> = entities let chunks: itertools::IntoChunks<_> =
.into_iter() entities.iter().chunks(max_vars / bindings_per_statement);
.chunks(max_vars / bindings_per_statement);
// We'd like to flat_map here, but it's not obvious how to flat_map across Result. // We'd like to flat_map here, but it's not obvious how to flat_map across Result.
let results: Result<Vec<()>> = chunks.into_iter().map(|chunk| -> Result<()> { let results: Result<Vec<()>> = chunks.into_iter().map(|chunk| -> Result<()> {
@ -973,9 +970,8 @@ impl MentatStoring for rusqlite::Connection {
let mut outer_searchid = 2000; let mut outer_searchid = 2000;
let chunks: itertools::IntoChunks<_> = entities let chunks: itertools::IntoChunks<_> =
.into_iter() entities.iter().chunks(max_vars / bindings_per_statement);
.chunks(max_vars / bindings_per_statement);
// From string to (searchid, value_type_tag). // From string to (searchid, value_type_tag).
let mut seen: HashMap<ValueRc<String>, (i64, i32)> = HashMap::with_capacity(entities.len()); let mut seen: HashMap<ValueRc<String>, (i64, i32)> = HashMap::with_capacity(entities.len());
@ -996,7 +992,7 @@ impl MentatStoring for rusqlite::Connection {
u8 /* flags0 */, u8 /* flags0 */,
i64 /* searchid */)>> = chunk.map(|&(e, a, ref attribute, ref typed_value, added)| { i64 /* searchid */)>> = chunk.map(|&(e, a, ref attribute, ref typed_value, added)| {
match typed_value { match typed_value {
&TypedValue::String(ref rc) => { TypedValue::String(ref rc) => {
datom_count += 1; datom_count += 1;
let entry = seen.entry(rc.clone()); let entry = seen.entry(rc.clone());
match entry { match entry {
@ -1186,7 +1182,10 @@ pub fn update_metadata(
// TODO: use concat! to avoid creating String instances. // TODO: use concat! to avoid creating String instances.
if !metadata_report.idents_altered.is_empty() { if !metadata_report.idents_altered.is_empty() {
// Idents is the materialized view of the [entid :db/ident ident] slice of datoms. // Idents is the materialized view of the [entid :db/ident ident] slice of datoms.
conn.execute(format!("DELETE FROM idents").as_str(), rusqlite::params![])?; conn.execute(
"DELETE FROM idents".to_string().as_str(),
rusqlite::params![],
)?;
conn.execute( conn.execute(
format!( format!(
"INSERT INTO idents SELECT e, a, v, value_type_tag FROM datoms WHERE a IN {}", "INSERT INTO idents SELECT e, a, v, value_type_tag FROM datoms WHERE a IN {}",
@ -1208,7 +1207,10 @@ pub fn update_metadata(
|| !metadata_report.attributes_altered.is_empty() || !metadata_report.attributes_altered.is_empty()
|| !metadata_report.idents_altered.is_empty() || !metadata_report.idents_altered.is_empty()
{ {
conn.execute(format!("DELETE FROM schema").as_str(), rusqlite::params![])?; conn.execute(
"DELETE FROM schema".to_string().as_str(),
rusqlite::params![],
)?;
// NB: we're using :db/valueType as a placeholder for the entire schema-defining set. // NB: we're using :db/valueType as a placeholder for the entire schema-defining set.
let s = format!( let s = format!(
r#" r#"


@ -117,7 +117,7 @@ impl Datom {
pub fn to_edn(&self) -> edn::Value { pub fn to_edn(&self) -> edn::Value {
let f = |entid: &EntidOrIdent| -> edn::Value { let f = |entid: &EntidOrIdent| -> edn::Value {
match *entid { match *entid {
EntidOrIdent::Entid(ref y) => edn::Value::Integer(y.clone()), EntidOrIdent::Entid(ref y) => edn::Value::Integer(*y),
EntidOrIdent::Ident(ref y) => edn::Value::Keyword(y.clone()), EntidOrIdent::Ident(ref y) => edn::Value::Keyword(y.clone()),
} }
}; };
@ -134,13 +134,13 @@ impl Datom {
impl Datoms { impl Datoms {
pub fn to_edn(&self) -> edn::Value { pub fn to_edn(&self) -> edn::Value {
edn::Value::Vector((&self.0).into_iter().map(|x| x.to_edn()).collect()) edn::Value::Vector((&self.0).iter().map(|x| x.to_edn()).collect())
} }
} }
impl Transactions { impl Transactions {
pub fn to_edn(&self) -> edn::Value { pub fn to_edn(&self) -> edn::Value {
edn::Value::Vector((&self.0).into_iter().map(|x| x.to_edn()).collect()) edn::Value::Vector((&self.0).iter().map(|x| x.to_edn()).collect())
} }
} }
@ -148,7 +148,7 @@ impl FulltextValues {
pub fn to_edn(&self) -> edn::Value { pub fn to_edn(&self) -> edn::Value {
edn::Value::Vector( edn::Value::Vector(
(&self.0) (&self.0)
.into_iter() .iter()
.map(|&(x, ref y)| { .map(|&(x, ref y)| {
edn::Value::Vector(vec![edn::Value::Integer(x), edn::Value::Text(y.clone())]) edn::Value::Vector(vec![edn::Value::Integer(x), edn::Value::Text(y.clone())])
}) })
@ -238,7 +238,7 @@ pub fn datoms_after<S: Borrow<Schema>>(
e: EntidOrIdent::Entid(e), e: EntidOrIdent::Entid(e),
a: to_entid(borrowed_schema, a), a: to_entid(borrowed_schema, a),
v: value, v: value,
tx: tx, tx,
added: None, added: None,
})) }))
})? })?
@ -286,7 +286,7 @@ pub fn transactions_after<S: Borrow<Schema>>(
e: EntidOrIdent::Entid(e), e: EntidOrIdent::Entid(e),
a: to_entid(borrowed_schema, a), a: to_entid(borrowed_schema, a),
v: value, v: value,
tx: tx, tx,
added: Some(added), added: Some(added),
}) })
})? })?
@ -332,12 +332,12 @@ pub fn dump_sql_query(
let mut stmt: rusqlite::Statement = conn.prepare(sql)?; let mut stmt: rusqlite::Statement = conn.prepare(sql)?;
let mut tw = TabWriter::new(Vec::new()).padding(2); let mut tw = TabWriter::new(Vec::new()).padding(2);
write!(&mut tw, "{}\n", sql).unwrap(); writeln!(&mut tw, "{}", sql).unwrap();
for column_name in stmt.column_names() { for column_name in stmt.column_names() {
write!(&mut tw, "{}\t", column_name).unwrap(); write!(&mut tw, "{}\t", column_name).unwrap();
} }
write!(&mut tw, "\n").unwrap(); writeln!(&mut tw).unwrap();
let r: Result<Vec<_>> = stmt let r: Result<Vec<_>> = stmt
.query_and_then(params, |row| { .query_and_then(params, |row| {
@ -345,7 +345,7 @@ pub fn dump_sql_query(
let value: rusqlite::types::Value = row.get(i)?; let value: rusqlite::types::Value = row.get(i)?;
write!(&mut tw, "{:?}\t", value).unwrap(); write!(&mut tw, "{:?}\t", value).unwrap();
} }
write!(&mut tw, "\n").unwrap(); writeln!(&mut tw).unwrap();
Ok(()) Ok(())
})? })?
.collect(); .collect();
@ -381,8 +381,9 @@ impl TestConn {
I: Borrow<str>, I: Borrow<str>,
{ {
// Failure to parse the transaction is a coding error, so we unwrap. // Failure to parse the transaction is a coding error, so we unwrap.
let entities = edn::parse::entities(transaction.borrow()) let entities = edn::parse::entities(transaction.borrow()).unwrap_or_else(|_| {
.expect(format!("to be able to parse {} into entities", transaction.borrow()).as_str()); panic!("to be able to parse {} into entities", transaction.borrow())
});
let details = { let details = {
// The block scopes the borrow of self.sqlite. // The block scopes the borrow of self.sqlite.


@ -86,7 +86,7 @@ impl TransactableValue for ValueAndSpan {
.as_text() .as_text()
.cloned() .cloned()
.map(TempId::External) .map(TempId::External)
.map(|v| v.into()) .map(|v| v)
} }
} }
@ -117,7 +117,7 @@ impl TransactableValue for TypedValue {
fn as_tempid(&self) -> Option<TempId> { fn as_tempid(&self) -> Option<TempId> {
match self { match self {
&TypedValue::String(ref s) => Some(TempId::External((**s).clone()).into()), TypedValue::String(ref s) => Some(TempId::External((**s).clone())),
_ => None, _ => None,
} }
} }


@ -95,7 +95,7 @@ pub fn to_namespaced_keyword(s: &str) -> Result<symbols::Keyword> {
_ => None, _ => None,
}; };
nsk.ok_or(DbErrorKind::NotYetImplemented(format!("InvalidKeyword: {}", s)).into()) nsk.ok_or_else(|| DbErrorKind::NotYetImplemented(format!("InvalidKeyword: {}", s)).into())
} }
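The `ok_or(..)` → `ok_or_else(|| ..)` rewrite above (and the similar ones elsewhere in this commit) implements clippy's `or_fun_call` lint: `ok_or` evaluates its argument eagerly, so the `format!` allocation happened even when the `Option` was `Some`. A sketch of the difference, counting how often the error message is actually built:

```rust
use std::cell::Cell;

fn main() {
    let calls = Cell::new(0);
    let make_err = |s: &str| -> String {
        calls.set(calls.get() + 1);
        format!("InvalidKeyword: {}", s)
    };

    let present: Option<i64> = Some(42);

    // Eager: the error String is built even though it's thrown away.
    let _ = present.ok_or(make_err("a/b"));
    assert_eq!(calls.get(), 1);

    // Lazy: the closure only runs on the `None` path.
    let _ = present.ok_or_else(|| make_err("a/b"));
    assert_eq!(calls.get(), 1);

    let absent: Option<i64> = None;
    let err = absent.ok_or_else(|| make_err("a/b")).unwrap_err();
    assert_eq!(calls.get(), 2);
    assert_eq!(err, "InvalidKeyword: a/b");
}
```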
/// Prepare an SQL `VALUES` block, like (?, ?, ?), (?, ?, ?). /// Prepare an SQL `VALUES` block, like (?, ?, ?), (?, ?, ?).


@ -111,7 +111,7 @@ fn update_attribute_map_from_schema_retractions(
let mut eas = BTreeMap::new(); let mut eas = BTreeMap::new();
for (e, a, v) in retractions.into_iter() { for (e, a, v) in retractions.into_iter() {
if entids::is_a_schema_attribute(a) { if entids::is_a_schema_attribute(a) {
eas.entry(e).or_insert(vec![]).push(a); eas.entry(e).or_insert_with(|| vec![]).push(a);
suspect_retractions.push((e, a, v)); suspect_retractions.push((e, a, v));
} else { } else {
filtered_retractions.push((e, a, v)); filtered_retractions.push((e, a, v));
@ -145,7 +145,7 @@ fn update_attribute_map_from_schema_retractions(
// Remove attributes corresponding to retracted attribute. // Remove attributes corresponding to retracted attribute.
attribute_map.remove(&e); attribute_map.remove(&e);
} else { } else {
bail!(DbErrorKind::BadSchemaAssertion(format!("Retracting defining attributes of a schema without retracting its :db/ident is not permitted."))); bail!(DbErrorKind::BadSchemaAssertion("Retracting defining attributes of a schema without retracting its :db/ident is not permitted.".to_string()));
} }
} else { } else {
filtered_retractions.push((e, a, v)); filtered_retractions.push((e, a, v));
@ -172,7 +172,7 @@ pub fn update_attribute_map_from_entid_triples(
) -> AttributeBuilder { ) -> AttributeBuilder {
existing existing
.get(&attribute_id) .get(&attribute_id)
.map(AttributeBuilder::to_modify_attribute) .map(AttributeBuilder::modify_attribute)
.unwrap_or_else(AttributeBuilder::default) .unwrap_or_else(AttributeBuilder::default)
} }
@ -337,8 +337,8 @@ pub fn update_attribute_map_from_entid_triples(
} }
Ok(MetadataReport { Ok(MetadataReport {
attributes_installed: attributes_installed, attributes_installed,
attributes_altered: attributes_altered, attributes_altered,
idents_altered: BTreeMap::default(), idents_altered: BTreeMap::default(),
}) })
} }
@ -439,12 +439,12 @@ where
// component_attributes up-to-date: most of the time we'll rebuild it // component_attributes up-to-date: most of the time we'll rebuild it
// even though it's not necessary (e.g. a schema attribute that's _not_ // even though it's not necessary (e.g. a schema attribute that's _not_
// a component was removed, or a non-component related attribute changed). // a component was removed, or a non-component related attribute changed).
if report.attributes_did_change() || ident_set.retracted.len() > 0 { if report.attributes_did_change() || !ident_set.retracted.is_empty() {
schema.update_component_attributes(); schema.update_component_attributes();
} }
Ok(MetadataReport { Ok(MetadataReport {
idents_altered: idents_altered, idents_altered,
..report ..report
}) })
} }


@ -77,7 +77,7 @@ fn validate_attribute_map(entid_map: &EntidMap, attribute_map: &AttributeMap) ->
entid_map entid_map
.get(entid) .get(entid)
.map(|ident| ident.to_string()) .map(|ident| ident.to_string())
.unwrap_or(entid.to_string()) .unwrap_or_else(|| entid.to_string())
}; };
attribute.validate(ident)?; attribute.validate(ident)?;
} }
@ -108,7 +108,7 @@ impl AttributeBuilder {
/// Make a new AttributeBuilder from an existing Attribute. This is important to allow /// Make a new AttributeBuilder from an existing Attribute. This is important to allow
/// retraction. Only attributes that we allow to change are duplicated here. /// retraction. Only attributes that we allow to change are duplicated here.
pub fn to_modify_attribute(attribute: &Attribute) -> Self { pub fn modify_attribute(attribute: &Attribute) -> Self {
let mut ab = AttributeBuilder::default(); let mut ab = AttributeBuilder::default();
ab.multival = Some(attribute.multival); ab.multival = Some(attribute.multival);
ab.unique = Some(attribute.unique); ab.unique = Some(attribute.unique);
@ -116,22 +116,22 @@ impl AttributeBuilder {
ab ab
} }
pub fn value_type<'a>(&'a mut self, value_type: ValueType) -> &'a mut Self { pub fn value_type(&mut self, value_type: ValueType) -> &mut Self {
self.value_type = Some(value_type); self.value_type = Some(value_type);
self self
} }
pub fn multival<'a>(&'a mut self, multival: bool) -> &'a mut Self { pub fn multival(&mut self, multival: bool) -> &mut Self {
self.multival = Some(multival); self.multival = Some(multival);
self self
} }
pub fn non_unique<'a>(&'a mut self) -> &'a mut Self { pub fn non_unique(&mut self) -> &mut Self {
self.unique = Some(None); self.unique = Some(None);
self self
} }
pub fn unique<'a>(&'a mut self, unique: attribute::Unique) -> &'a mut Self { pub fn unique(&mut self, unique: attribute::Unique) -> &mut Self {
if self.helpful && unique == attribute::Unique::Identity { if self.helpful && unique == attribute::Unique::Identity {
self.index = Some(true); self.index = Some(true);
} }
@ -139,12 +139,12 @@ impl AttributeBuilder {
self self
} }
pub fn index<'a>(&'a mut self, index: bool) -> &'a mut Self { pub fn index(&mut self, index: bool) -> &mut Self {
self.index = Some(index); self.index = Some(index);
self self
} }
pub fn fulltext<'a>(&'a mut self, fulltext: bool) -> &'a mut Self { pub fn fulltext(&mut self, fulltext: bool) -> &mut Self {
self.fulltext = Some(fulltext); self.fulltext = Some(fulltext);
if self.helpful && fulltext { if self.helpful && fulltext {
self.index = Some(true); self.index = Some(true);
@ -152,12 +152,12 @@ impl AttributeBuilder {
self self
} }
pub fn component<'a>(&'a mut self, component: bool) -> &'a mut Self { pub fn component(&mut self, component: bool) -> &mut Self {
self.component = Some(component); self.component = Some(component);
self self
} }
pub fn no_history<'a>(&'a mut self, no_history: bool) -> &'a mut Self { pub fn no_history(&mut self, no_history: bool) -> &mut Self {
self.no_history = Some(no_history); self.no_history = Some(no_history);
self self
} }
@ -197,7 +197,7 @@ impl AttributeBuilder {
attribute.multival = multival; attribute.multival = multival;
} }
if let Some(ref unique) = self.unique { if let Some(ref unique) = self.unique {
attribute.unique = unique.clone(); attribute.unique = *unique;
} }
if let Some(index) = self.index { if let Some(index) = self.index {
attribute.index = index; attribute.index = index;
@ -223,14 +223,12 @@ impl AttributeBuilder {
if let Some(ref unique) = self.unique { if let Some(ref unique) = self.unique {
if *unique != attribute.unique { if *unique != attribute.unique {
attribute.unique = unique.clone(); attribute.unique = *unique;
mutations.push(AttributeAlteration::Unique);
}
} else {
if attribute.unique != None {
attribute.unique = None;
mutations.push(AttributeAlteration::Unique); mutations.push(AttributeAlteration::Unique);
} }
} else if attribute.unique != None {
attribute.unique = None;
mutations.push(AttributeAlteration::Unique);
} }
if let Some(index) = self.index { if let Some(index) = self.index {
@ -272,17 +270,17 @@ pub trait SchemaBuilding {
impl SchemaBuilding for Schema { impl SchemaBuilding for Schema {
fn require_ident(&self, entid: Entid) -> Result<&symbols::Keyword> { fn require_ident(&self, entid: Entid) -> Result<&symbols::Keyword> {
self.get_ident(entid) self.get_ident(entid)
.ok_or(DbErrorKind::UnrecognizedEntid(entid).into()) .ok_or_else(|| DbErrorKind::UnrecognizedEntid(entid).into())
} }
fn require_entid(&self, ident: &symbols::Keyword) -> Result<KnownEntid> { fn require_entid(&self, ident: &symbols::Keyword) -> Result<KnownEntid> {
self.get_entid(&ident) self.get_entid(&ident)
.ok_or(DbErrorKind::UnrecognizedIdent(ident.to_string()).into()) .ok_or_else(|| DbErrorKind::UnrecognizedIdent(ident.to_string()).into())
} }
fn require_attribute_for_entid(&self, entid: Entid) -> Result<&Attribute> { fn require_attribute_for_entid(&self, entid: Entid) -> Result<&Attribute> {
self.attribute_for_entid(entid) self.attribute_for_entid(entid)
.ok_or(DbErrorKind::UnrecognizedEntid(entid).into()) .ok_or_else(|| DbErrorKind::UnrecognizedEntid(entid).into())
} }
/// Create a valid `Schema` from the constituent maps. /// Create a valid `Schema` from the constituent maps.
@ -290,10 +288,7 @@ impl SchemaBuilding for Schema {
ident_map: IdentMap, ident_map: IdentMap,
attribute_map: AttributeMap, attribute_map: AttributeMap,
) -> Result<Schema> { ) -> Result<Schema> {
let entid_map: EntidMap = ident_map let entid_map: EntidMap = ident_map.iter().map(|(k, v)| (*v, k.clone())).collect();
.iter()
.map(|(k, v)| (v.clone(), k.clone()))
.collect();
validate_attribute_map(&entid_map, &attribute_map)?; validate_attribute_map(&entid_map, &attribute_map)?;
Ok(Schema::new(ident_map, entid_map, attribute_map)) Ok(Schema::new(ident_map, entid_map, attribute_map))
@ -309,10 +304,10 @@ impl SchemaBuilding for Schema {
.map(|(symbolic_ident, symbolic_attr, value)| { .map(|(symbolic_ident, symbolic_attr, value)| {
let ident: i64 = *ident_map let ident: i64 = *ident_map
.get(&symbolic_ident) .get(&symbolic_ident)
.ok_or(DbErrorKind::UnrecognizedIdent(symbolic_ident.to_string()))?; .ok_or_else(|| DbErrorKind::UnrecognizedIdent(symbolic_ident.to_string()))?;
let attr: i64 = *ident_map let attr: i64 = *ident_map
.get(&symbolic_attr) .get(&symbolic_attr)
.ok_or(DbErrorKind::UnrecognizedIdent(symbolic_attr.to_string()))?; .ok_or_else(|| DbErrorKind::UnrecognizedIdent(symbolic_attr.to_string()))?;
Ok((ident, attr, value)) Ok((ident, attr, value))
}) })
.collect(); .collect();


@ -58,7 +58,7 @@ fn collect_ordered_txs_to_move(
None => bail!(DbErrorKind::TimelinesInvalidRange), None => bail!(DbErrorKind::TimelinesInvalidRange),
}; };
while let Some(t) = rows.next() { for t in rows {
let t = t?; let t = t?;
txs.push(t.0); txs.push(t.0);
if t.1 != timeline { if t.1 != timeline {
@ -108,12 +108,13 @@ fn reversed_terms_for(
tx_id: Entid, tx_id: Entid,
) -> Result<Vec<TermWithoutTempIds>> { ) -> Result<Vec<TermWithoutTempIds>> {
let mut stmt = conn.prepare("SELECT e, a, v, value_type_tag, tx, added FROM timelined_transactions WHERE tx = ? AND timeline = ? ORDER BY tx DESC")?; let mut stmt = conn.prepare("SELECT e, a, v, value_type_tag, tx, added FROM timelined_transactions WHERE tx = ? AND timeline = ? ORDER BY tx DESC")?;
let mut rows = stmt.query_and_then( let rows = stmt.query_and_then(
&[&tx_id, &::TIMELINE_MAIN], &[&tx_id, &::TIMELINE_MAIN],
|row| -> Result<TermWithoutTempIds> { |row| -> Result<TermWithoutTempIds> {
let op = match row.get(5)? { let op = if row.get(5)? {
true => OpType::Retract, OpType::Retract
false => OpType::Add, } else {
OpType::Add
}; };
Ok(Term::AddOrRetract( Ok(Term::AddOrRetract(
op, op,
@ -126,7 +127,7 @@ fn reversed_terms_for(
let mut terms = vec![]; let mut terms = vec![];
while let Some(row) = rows.next() { for row in rows {
terms.push(row?); terms.push(row?);
} }
Ok(terms) Ok(terms)
@ -141,9 +142,9 @@ pub fn move_from_main_timeline(
new_timeline: Entid, new_timeline: Entid,
) -> Result<(Option<Schema>, PartitionMap)> { ) -> Result<(Option<Schema>, PartitionMap)> {
if new_timeline == ::TIMELINE_MAIN { if new_timeline == ::TIMELINE_MAIN {
bail!(DbErrorKind::NotYetImplemented(format!( bail!(DbErrorKind::NotYetImplemented(
"Can't move transactions to main timeline" "Can't move transactions to main timeline".to_string()
))); ));
} }
// We don't currently ensure that moving transactions onto a non-empty timeline // We don't currently ensure that moving transactions onto a non-empty timeline


@ -163,12 +163,12 @@ where
tx_id: Entid, tx_id: Entid,
) -> Tx<'conn, 'a, W> { ) -> Tx<'conn, 'a, W> {
Tx { Tx {
store: store, store,
partition_map: partition_map, partition_map,
schema_for_mutation: Cow::Borrowed(schema_for_mutation), schema_for_mutation: Cow::Borrowed(schema_for_mutation),
schema: schema, schema,
watcher: watcher, watcher,
tx_id: tx_id, tx_id,
} }
} }
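The `store: store` → `store` changes above (also applied to `overlay`, `cache`, `tx`, and the `MetadataReport` fields elsewhere in this commit) use field init shorthand, which clippy flags via `redundant_field_names`. A sketch (the struct here is illustrative, not Mentat's `Tx`):

```rust
#[derive(Debug, PartialEq)]
struct Tx {
    tx_id: i64,
    active: bool,
}

fn main() {
    let tx_id = 7;
    let active = true;

    // Before: `field: field` repeats every name.
    let verbose = Tx { tx_id: tx_id, active: active };

    // After: when the local variable matches the field name, naming it
    // once is enough.
    let shorthand = Tx { tx_id, active };

    assert_eq!(verbose, shorthand);
}
```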
@ -185,8 +185,8 @@ where
// Map [a v]->entid. // Map [a v]->entid.
let mut av_pairs: Vec<&AVPair> = vec![]; let mut av_pairs: Vec<&AVPair> = vec![];
for i in 0..temp_id_avs.len() { for temp_id_av in temp_id_avs {
av_pairs.push(&temp_id_avs[i].1); av_pairs.push(&temp_id_av.1);
} }
// Lookup in the store. // Lookup in the store.
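The loop rewrite just above is clippy's `needless_range_loop` fix: indexing with `0..len()` adds a bounds check per access and obscures the intent, while iterating the collection borrows each element directly. A sketch (the tuple shape is illustrative, not Mentat's `temp_id_avs` type):

```rust
fn main() {
    let temp_id_avs: Vec<(String, (i64, i64))> = vec![
        ("t1".into(), (10, 100)),
        ("t2".into(), (20, 200)),
    ];

    // Before: index-based loop, with a bounds check on every access.
    let mut by_index: Vec<&(i64, i64)> = vec![];
    for i in 0..temp_id_avs.len() {
        by_index.push(&temp_id_avs[i].1);
    }

    // After: borrow each element directly; no indexing, same result.
    let mut by_iter: Vec<&(i64, i64)> = vec![];
    for temp_id_av in &temp_id_avs {
        by_iter.push(&temp_id_av.1);
    }

    assert_eq!(by_index, by_iter);
}
```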
@ -208,14 +208,14 @@ where
av_map.get(&av_pair) av_map.get(&av_pair)
); );
if let Some(entid) = av_map.get(&av_pair).cloned().map(KnownEntid) { if let Some(entid) = av_map.get(&av_pair).cloned().map(KnownEntid) {
tempids.insert(tempid.clone(), entid).map(|previous| { if let Some(previous) = tempids.insert(tempid.clone(), entid) {
if entid != previous { if entid != previous {
conflicting_upserts conflicting_upserts
.entry((**tempid).clone()) .entry((**tempid).clone())
.or_insert_with(|| once(previous).collect::<BTreeSet<_>>()) .or_insert_with(|| once(previous).collect::<BTreeSet<_>>())
.insert(entid); .insert(entid);
} }
}); }
} }
} }
@ -340,7 +340,7 @@ where
entmod::EntityPlace::TxFunction(ref tx_function) => { entmod::EntityPlace::TxFunction(ref tx_function) => {
match tx_function.op.0.as_str() { match tx_function.op.0.as_str() {
"transaction-tx" => Ok(Either::Left(self.tx_id)), "transaction-tx" => Ok(Either::Left(self.tx_id)),
unknown @ _ => bail!(DbErrorKind::NotYetImplemented(format!( unknown => bail!(DbErrorKind::NotYetImplemented(format!(
"Unknown transaction function {}", "Unknown transaction function {}",
unknown unknown
))), ))),
@ -372,7 +372,7 @@ where
) -> Result<KnownEntidOr<LookupRefOrTempId>> { ) -> Result<KnownEntidOr<LookupRefOrTempId>> {
match backward_a.unreversed() { match backward_a.unreversed() {
None => { None => {
bail!(DbErrorKind::NotYetImplemented(format!("Cannot explode map notation value in :attr/_reversed notation for forward attribute"))); bail!(DbErrorKind::NotYetImplemented("Cannot explode map notation value in :attr/_reversed notation for forward attribute".to_string()));
} }
Some(forward_a) => { Some(forward_a) => {
let forward_a = self.entity_a_into_term_a(forward_a)?; let forward_a = self.entity_a_into_term_a(forward_a)?;
@ -412,7 +412,7 @@ where
entmod::ValuePlace::TxFunction(ref tx_function) => { entmod::ValuePlace::TxFunction(ref tx_function) => {
match tx_function.op.0.as_str() { match tx_function.op.0.as_str() {
"transaction-tx" => Ok(Either::Left(KnownEntid(self.tx_id.0))), "transaction-tx" => Ok(Either::Left(KnownEntid(self.tx_id.0))),
unknown @ _ => bail!(DbErrorKind::NotYetImplemented(format!("Unknown transaction function {}", unknown))), unknown => bail!(DbErrorKind::NotYetImplemented(format!("Unknown transaction function {}", unknown))),
} }
}, },
@ -456,7 +456,7 @@ where
op: OpType::Add, op: OpType::Add,
e: db_id.clone(), e: db_id.clone(),
a: AttributePlace::Entid(a), a: AttributePlace::Entid(a),
v: v, v,
}); });
} }
} }
@ -519,7 +519,7 @@ where
entmod::ValuePlace::TxFunction(ref tx_function) => { entmod::ValuePlace::TxFunction(ref tx_function) => {
let typed_value = match tx_function.op.0.as_str() { let typed_value = match tx_function.op.0.as_str() {
"transaction-tx" => TypedValue::Ref(self.tx_id), "transaction-tx" => TypedValue::Ref(self.tx_id),
unknown @ _ => bail!(DbErrorKind::NotYetImplemented(format!( unknown => bail!(DbErrorKind::NotYetImplemented(format!(
"Unknown transaction function {}", "Unknown transaction function {}",
unknown unknown
))), ))),
@ -546,7 +546,7 @@ where
for vv in vs { for vv in vs {
deque.push_front(Entity::AddOrRetract { deque.push_front(Entity::AddOrRetract {
op: op.clone(), op,
e: e.clone(), e: e.clone(),
a: AttributePlace::Entid(entmod::EntidOrIdent::Entid(a)), a: AttributePlace::Entid(entmod::EntidOrIdent::Entid(a)),
v: vv, v: vv,
@ -667,8 +667,8 @@ where
|term: TermWithTempIdsAndLookupRefs| -> Result<TermWithTempIds> { |term: TermWithTempIdsAndLookupRefs| -> Result<TermWithTempIds> {
match term { match term {
Term::AddOrRetract(op, e, a, v) => { Term::AddOrRetract(op, e, a, v) => {
let e = replace_lookup_ref(&lookup_ref_map, e, |x| KnownEntid(x))?; let e = replace_lookup_ref(&lookup_ref_map, e, KnownEntid)?;
let v = replace_lookup_ref(&lookup_ref_map, v, |x| TypedValue::Ref(x))?; let v = replace_lookup_ref(&lookup_ref_map, v, TypedValue::Ref)?;
Ok(Term::AddOrRetract(op, e, a, v)) Ok(Term::AddOrRetract(op, e, a, v))
} }
} }
@ -757,14 +757,14 @@ where
for (tempid, entid) in temp_id_map { for (tempid, entid) in temp_id_map {
// Since `UpsertEV` instances always transition to `UpsertE` instances, it might be // Since `UpsertEV` instances always transition to `UpsertE` instances, it might be
// that a tempid resolves in two generations, and those resolutions might conflict. // that a tempid resolves in two generations, and those resolutions might conflict.
tempids.insert((*tempid).clone(), entid).map(|previous| { if let Some(previous) = tempids.insert((*tempid).clone(), entid) {
if entid != previous { if entid != previous {
conflicting_upserts conflicting_upserts
.entry((*tempid).clone()) .entry((*tempid).clone())
.or_insert_with(|| once(previous).collect::<BTreeSet<_>>()) .or_insert_with(|| once(previous).collect::<BTreeSet<_>>())
.insert(entid); .insert(entid);
} }
}); }
} }
if !conflicting_upserts.is_empty() { if !conflicting_upserts.is_empty() {
@ -891,10 +891,7 @@ where
.map(|v| (true, v)) .map(|v| (true, v))
.chain(ars.retract.into_iter().map(|v| (false, v))) .chain(ars.retract.into_iter().map(|v| (false, v)))
{ {
let op = match added { let op = if added { OpType::Add } else { OpType::Retract };
true => OpType::Add,
false => OpType::Retract,
};
self.watcher.datom(op, e, a, &v); self.watcher.datom(op, e, a, &v);
queue.push((e, a, attribute, v, added)); queue.push((e, a, attribute, v, added));
} }
@ -967,7 +964,7 @@ where
Ok(TxReport { Ok(TxReport {
tx_id: self.tx_id, tx_id: self.tx_id,
tx_instant, tx_instant,
tempids: tempids, tempids,
}) })
} }
} }
@ -1093,9 +1090,9 @@ where
let a_and_r = trie let a_and_r = trie
.entry((a, attribute)) .entry((a, attribute))
.or_insert(BTreeMap::default()) .or_insert_with(BTreeMap::default)
.entry(e) .entry(e)
.or_insert(AddAndRetract::default()); .or_insert_with(AddAndRetract::default);
match op { match op {
OpType::Add => a_and_r.add.insert(v), OpType::Add => a_and_r.add.insert(v),
@ -1136,9 +1133,9 @@ fn get_or_insert_tx_instant<'schema>(
entids::DB_TX_INSTANT, entids::DB_TX_INSTANT,
schema.require_attribute_for_entid(entids::DB_TX_INSTANT)?, schema.require_attribute_for_entid(entids::DB_TX_INSTANT)?,
)) ))
.or_insert(BTreeMap::default()) .or_insert_with(BTreeMap::default)
.entry(tx_id) .entry(tx_id)
.or_insert(AddAndRetract::default()); .or_insert_with(AddAndRetract::default);
if !ars.retract.is_empty() { if !ars.retract.is_empty() {
// Cannot retract :db/txInstant! // Cannot retract :db/txInstant!
} }
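The `.or_insert(BTreeMap::default())` to `.or_insert_with(BTreeMap::default)` changes above follow Clippy's `or_fun_call` lint: `or_insert` evaluates its argument even when the entry is already occupied, while `or_insert_with` only builds the default value on a miss. A minimal standalone sketch with illustrative names:

```rust
use std::collections::BTreeMap;

fn main() {
    let mut trie: BTreeMap<&str, BTreeMap<&str, Vec<i32>>> = BTreeMap::new();

    // The inner map and vector are constructed only when the key is
    // actually missing; on later calls the closure is never invoked.
    trie.entry("a")
        .or_insert_with(BTreeMap::default)
        .entry("e1")
        .or_insert_with(Vec::default)
        .push(1);

    assert_eq!(trie["a"]["e1"], vec![1]);
}
```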


@ -82,17 +82,18 @@ impl TxCommand {
impl Command for TxCommand { impl Command for TxCommand {
fn execute(&mut self) { fn execute(&mut self) {
self.observers.upgrade().map(|observers| { if let Some(observers) = self.observers.upgrade() {
for (key, observer) in observers.iter() { for (key, observer) in observers.iter() {
let applicable_reports = observer.applicable_reports(&self.reports); let applicable_reports = observer.applicable_reports(&self.reports);
if !applicable_reports.is_empty() { if !applicable_reports.is_empty() {
observer.notify(&key, applicable_reports); observer.notify(&key, applicable_reports);
} }
} }
}); }
} }
} }
#[derive(Default)]
pub struct TxObservationService { pub struct TxObservationService {
observers: Arc<IndexMap<String, Arc<TxObserver>>>, observers: Arc<IndexMap<String, Arc<TxObserver>>>,
executor: Option<Sender<Box<dyn Command + Send>>>, executor: Option<Sender<Box<dyn Command + Send>>>,
@ -107,7 +108,7 @@ impl TxObservationService {
} }
// For testing purposes // For testing purposes
pub fn is_registered(&self, key: &String) -> bool { pub fn is_registered(&self, key: &str) -> bool {
self.observers.contains_key(key) self.observers.contains_key(key)
} }
@ -115,7 +116,7 @@ impl TxObservationService {
Arc::make_mut(&mut self.observers).insert(key, observer); Arc::make_mut(&mut self.observers).insert(key, observer);
} }
pub fn deregister(&mut self, key: &String) { pub fn deregister(&mut self, key: &str) {
Arc::make_mut(&mut self.observers).remove(key); Arc::make_mut(&mut self.observers).remove(key);
} }
@ -154,6 +155,7 @@ impl Drop for TxObservationService {
} }
} }
#[derive(Default)]
pub struct InProgressObserverTransactWatcher { pub struct InProgressObserverTransactWatcher {
collected_attributes: AttributeSet, collected_attributes: AttributeSet,
pub txes: IndexMap<Entid, AttributeSet>, pub txes: IndexMap<Entid, AttributeSet>,
@ -174,8 +176,7 @@ impl TransactWatcher for InProgressObserverTransactWatcher {
} }
fn done(&mut self, t: &Entid, _schema: &Schema) -> Result<()> { fn done(&mut self, t: &Entid, _schema: &Schema) -> Result<()> {
let collected_attributes = let collected_attributes = ::std::mem::take(&mut self.collected_attributes);
::std::mem::replace(&mut self.collected_attributes, Default::default());
self.txes.insert(*t, collected_attributes); self.txes.insert(*t, collected_attributes);
Ok(()) Ok(())
} }
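The final hunk in this file replaces `mem::replace(&mut x, Default::default())` with `std::mem::take`, which does the same thing for any `Default` type. A standalone sketch:

```rust
fn main() {
    let mut collected = vec![1, 2, 3];

    // `take` swaps in the type's Default (an empty Vec here) and hands
    // back the old value, without cloning.
    let taken = std::mem::take(&mut collected);

    assert_eq!(taken, vec![1, 2, 3]);
    assert!(collected.is_empty());
}
```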


@ -127,8 +127,8 @@ pub struct DB {
impl DB { impl DB {
pub fn new(partition_map: PartitionMap, schema: Schema) -> DB { pub fn new(partition_map: PartitionMap, schema: Schema) -> DB {
DB { DB {
partition_map: partition_map, partition_map,
schema: schema, schema,
} }
} }
} }


@ -227,7 +227,7 @@ impl Generation {
} }
// Collect id->[a v] pairs that might upsert at this evolutionary step. // Collect id->[a v] pairs that might upsert at this evolutionary step.
pub(crate) fn temp_id_avs<'a>(&'a self) -> Vec<(TempIdHandle, AVPair)> { pub(crate) fn temp_id_avs(&self) -> Vec<(TempIdHandle, AVPair)> {
let mut temp_id_avs: Vec<(TempIdHandle, AVPair)> = vec![]; let mut temp_id_avs: Vec<(TempIdHandle, AVPair)> = vec![];
// TODO: map/collect. // TODO: map/collect.
for &UpsertE(ref t, ref a, ref v) in &self.upserts_e { for &UpsertE(ref t, ref a, ref v) in &self.upserts_e {
@ -269,32 +269,32 @@ impl Generation {
for term in self.allocations.iter() { for term in self.allocations.iter() {
match term { match term {
&Term::AddOrRetract(OpType::Add, Right(ref t1), a, Right(ref t2)) => { Term::AddOrRetract(OpType::Add, Right(ref t1), a, Right(ref t2)) => {
temp_ids.insert(t1.clone()); temp_ids.insert(t1.clone());
temp_ids.insert(t2.clone()); temp_ids.insert(t2.clone());
let attribute: &Attribute = schema.require_attribute_for_entid(a)?; let attribute: &Attribute = schema.require_attribute_for_entid(*a)?;
if attribute.unique == Some(attribute::Unique::Identity) { if attribute.unique == Some(attribute::Unique::Identity) {
tempid_avs tempid_avs
.entry((a, Right(t2.clone()))) .entry((*a, Right(t2.clone())))
.or_insert(vec![]) .or_insert_with(|| vec![])
.push(t1.clone()); .push(t1.clone());
} }
} }
&Term::AddOrRetract(OpType::Add, Right(ref t), a, ref x @ Left(_)) => { Term::AddOrRetract(OpType::Add, Right(ref t), a, ref x @ Left(_)) => {
temp_ids.insert(t.clone()); temp_ids.insert(t.clone());
let attribute: &Attribute = schema.require_attribute_for_entid(a)?; let attribute: &Attribute = schema.require_attribute_for_entid(*a)?;
if attribute.unique == Some(attribute::Unique::Identity) { if attribute.unique == Some(attribute::Unique::Identity) {
tempid_avs tempid_avs
.entry((a, x.clone())) .entry((*a, x.clone()))
.or_insert(vec![]) .or_insert_with(|| vec![])
.push(t.clone()); .push(t.clone());
} }
} }
&Term::AddOrRetract(OpType::Add, Left(_), _, Right(ref t)) => { Term::AddOrRetract(OpType::Add, Left(_), _, Right(ref t)) => {
temp_ids.insert(t.clone()); temp_ids.insert(t.clone());
} }
&Term::AddOrRetract(OpType::Add, Left(_), _, Left(_)) => unreachable!(), Term::AddOrRetract(OpType::Add, Left(_), _, Left(_)) => unreachable!(),
&Term::AddOrRetract(OpType::Retract, _, _, _) => { Term::AddOrRetract(OpType::Retract, _, _, _) => {
// [:db/retract ...] entities never allocate entids; they have to resolve due to // [:db/retract ...] entities never allocate entids; they have to resolve due to
// other upserts (or they fail the transaction). // other upserts (or they fail the transaction).
} }
@ -319,13 +319,11 @@ impl Generation {
); );
for vs in tempid_avs.values() { for vs in tempid_avs.values() {
vs.first() if let Some(&first_index) = vs.first().and_then(|first| temp_ids.get(first)) {
.and_then(|first| temp_ids.get(first)) for tempid in vs {
.map(|&first_index| { temp_ids.get(tempid).map(|&i| uf.union(first_index, i));
for tempid in vs { }
temp_ids.get(tempid).map(|&i| uf.union(first_index, i)); }
}
});
} }
debug!("union-find aggregation {:?}", uf.clone().into_labeling()); debug!("union-find aggregation {:?}", uf.clone().into_labeling());

docs/tutorial.md Normal file

@ -0,0 +1,177 @@
# Introduction
Mentat is a transactional, relational storage system built on top of SQLite. The abstractions it offers allow you to easily tackle some things that are tricky in other storage systems:
- Have multiple components share storage and collaborate.
- Evolve schema.
- Track change over time.
- Synchronize data correctly.
- Store data with rich, checked types.
Mentat offers a programmatic Rust API for managing stores, retrieving data, and _transacting_ new data. It offers a Datalog-based query engine, with queries expressed in EDN, a rich textual data format similar to JSON. And it offers an EDN data format for transacting new data.
This tutorial covers all of these APIs, along with defining vocabulary.
We'll begin by introducing some concepts, and then we'll walk through some examples.
## What does Mentat store?
Depending on your perspective, Mentat looks like a relational store, a graph store, or a tuple store.
Mentat stores relationships between _entities_ and other entities or _values_. An entity is related to other things by an _attribute_.
All entities have an _entity ID_ (abbreviated to _entid_).
Some entities additionally have an identifier called an _ident_, which is a keyword. That looks something like `:bookmark/title`.
A value is a primitive piece of data. Mentat supports the following:
* Strings
* Long integers
* Double-precision floating point numbers
* Millisecond-precision timestamps
* UUIDs
* Booleans
* Keywords (a special kind of string that we use for idents).
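As EDN literals, those value types look like this (a sketch; the `#inst` and `#uuid` tagged forms follow standard EDN reader conventions):

```edn
["a string"                                   ; string
 42                                           ; long integer
 3.14                                         ; double-precision float
 #inst "2018-01-25T20:07:04.408Z"             ; millisecond-precision timestamp
 #uuid "550e8400-e29b-41d4-a716-446655440000" ; UUID
 true                                         ; boolean
 :bookmark/title]                             ; keyword
```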
There are two special kinds of entities: _attributes_ and _transactions_.
Attributes are themselves entities with a particular set of properties that define their meaning. They have identifiers, so you can refer to them easily. They have a _value type_, which is the type of value Mentat expects to be on the right hand side of the relationship. And they have a _cardinality_ (whether one or many values can exist for a particular entity), whether values are _unique_, a documentation string, and some indexing options.
An attribute looks something like this:
```edn
{:db/ident :bookmark/title
:db/cardinality :db.cardinality/one
:db/valueType :db.type/string
:db/fulltext true
:db/doc "The title of a bookmark."}
```
Transactions are special entities that can be described however you wish. By default they track the timestamp at which they were written.
The relationship between an entity, an attribute, and a value, occurring in a _transaction_ (which is just another kind of entity!) — a tuple of five values — is called a _datom_.
A single datom might look something like this:
```
[:db/add 65536 :bookmark/title "mozilla.org" 268435456]
    ^      ^         ^              ^            ^
    |      |         |              |            \- The transaction ID.
    |      |         |              \- The value, a string.
    |      |         \- The attribute.
    |      \- The entity.
    \- Add or retract.
```
which is equivalent to saying "in transaction 268435456 we assert that entity 65536 is a bookmark with the title 'mozilla.org'".
When we transact that — which means to add it as a fact to the store — Mentat also describes the transaction itself on our behalf:
```edn
[:db/add 268435456 :db/txInstant "2018-01-25 20:07:04.408358 UTC" 268435456]
```
# A simple app
Let's get started with some Rust code.
First, the imports we'll need. The comments here briefly explain what each thing is.
```rust
// So you can define keywords with neater syntax.
#[macro_use(kw)]
extern crate mentat;
use mentat::{
    Store,     // A single database connection and in-memory metadata.
    ValueType, // The types a value can take.
};
use mentat::vocabulary;             // Vocabulary definitions.
use mentat::vocabulary::Definition; // A versioned vocabulary definition.
use mentat::vocabulary::attribute;  // Properties of attributes.
```
## Defining a simple vocabulary
All data in Mentat — even the terms we used above, like `:db/cardinality` — are defined in the store itself. So that's where we start. In Rust, we define a _vocabulary definition_, and ask the store to ensure that it exists.
```rust
fn set_up(mut store: Store) -> Result<()> {
// Start a write transaction.
let mut in_progress = store.begin_transaction()?;
// Make sure the core vocabulary exists. This is good practice if a user,
// an external tool, or another component might have touched the file
// since you last wrote it.
in_progress.verify_core_schema()?;
// Make sure our vocabulary is installed, and install if necessary.
// This defines some attributes that we can use to describe people.
in_progress.ensure_vocabulary(&Definition {
name: kw!(:example/people),
version: 1,
attributes: vec![
(kw!(:person/name),
vocabulary::AttributeBuilder::default()
.value_type(ValueType::String)
.multival(true)
.build()),
(kw!(:person/age),
vocabulary::AttributeBuilder::default()
.value_type(ValueType::Long)
.multival(false)
.build()),
(kw!(:person/email),
vocabulary::AttributeBuilder::default()
.value_type(ValueType::String)
.multival(true)
.unique(attribute::Unique::Identity)
.build()),
],
})?;
in_progress.commit()?;
Ok(())
}
```
We open a store and configure its vocabulary like this:
```rust
let path = "/path/to/file.db";
let store = Store::open(path)?;
set_up(store)?;
```
If this code returns successfully, we're good to go.
## Transactions
You'll see in our `set_up` function that we begin and end a transaction, which we call `in_progress`. A read-only transaction is begun via `begin_read`. The resulting objects, `InProgress` and `InProgressRead`, support various kinds of read and write operations. Transactions are automatically rolled back when dropped, so remember to call `commit`!
## Adding some data
There are two ways to add data to Mentat: programmatically or textually.
The textual form accepts EDN, a simple relative of JSON that supports richer types and more flexible syntax. You saw this in the introduction. Here's an example:
```rust
in_progress.transact(r#"[
{:person/name "Alice"
:person/age 32
:person/email "alice@example.org"}
]"#)?;
```
You can implicitly _upsert_ data when you have a unique attribute to use:
```rust
// Alice's age is now 33. Note that we don't need to find out an entid,
// nor explicitly INSERT OR REPLACE or UPDATE OR INSERT or similar.
in_progress.transact(r#"[
{:person/age 33
:person/email "alice@example.org"}
]"#)?;
```
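Reading the data back uses the Datalog query language mentioned in the introduction. A sketch of a query against the `:example/people` vocabulary defined above, expressed in EDN (you would pass this string to a query method such as `q_once`):

```edn
[:find ?name ?age
 :where
 [?person :person/name ?name]
 [?person :person/age ?age]]
```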


@ -49,8 +49,8 @@ impl TempId {
impl fmt::Display for TempId { impl fmt::Display for TempId {
fn fmt(&self, f: &mut fmt::Formatter) -> Result<(), fmt::Error> { fn fmt(&self, f: &mut fmt::Formatter) -> Result<(), fmt::Error> {
match self { match self {
&TempId::External(ref s) => write!(f, "{}", s), TempId::External(ref s) => write!(f, "{}", s),
&TempId::Internal(x) => write!(f, "<tempid {}>", x), TempId::Internal(x) => write!(f, "<tempid {}>", x),
} }
} }
} }
@ -76,8 +76,8 @@ impl From<Keyword> for EntidOrIdent {
impl EntidOrIdent { impl EntidOrIdent {
pub fn unreversed(&self) -> Option<EntidOrIdent> { pub fn unreversed(&self) -> Option<EntidOrIdent> {
match self { match self {
&EntidOrIdent::Entid(_) => None, EntidOrIdent::Entid(_) => None,
&EntidOrIdent::Ident(ref a) => a.unreversed().map(EntidOrIdent::Ident), EntidOrIdent::Ident(ref a) => a.unreversed().map(EntidOrIdent::Ident),
} }
} }
} }


@ -12,8 +12,8 @@ extern crate chrono;
extern crate itertools; extern crate itertools;
extern crate num; extern crate num;
extern crate ordered_float; extern crate ordered_float;
extern crate pretty;
extern crate peg; extern crate peg;
extern crate pretty;
extern crate uuid; extern crate uuid;
#[cfg(feature = "serde_support")] #[cfg(feature = "serde_support")]
@ -50,13 +50,11 @@ pub use types::{
pub use symbols::{Keyword, NamespacedSymbol, PlainSymbol}; pub use symbols::{Keyword, NamespacedSymbol, PlainSymbol};
use std::collections::{BTreeSet, BTreeMap, LinkedList}; use std::collections::{BTreeMap, BTreeSet, LinkedList};
use std::f64::{INFINITY, NAN, NEG_INFINITY};
use std::iter::FromIterator; use std::iter::FromIterator;
use std::f64::{NAN, INFINITY, NEG_INFINITY};
use chrono::{ use chrono::TimeZone;
TimeZone,
};
use entities::*; use entities::*;
use query::FromValue; use query::FromValue;
@ -126,7 +124,7 @@ peg::parser!(pub grammar parse() for str {
// result = r#""foo\\bar""# // result = r#""foo\\bar""#
// For the typical case, string_normal_chars will match multiple, leading to a single-element vec. // For the typical case, string_normal_chars will match multiple, leading to a single-element vec.
pub rule raw_text() -> String = "\"" t:((string_special_char() / string_normal_chars())*) "\"" pub rule raw_text() -> String = "\"" t:((string_special_char() / string_normal_chars())*) "\""
{ t.join(&"").to_string() } { t.join(&"") }
pub rule text() -> SpannedValue pub rule text() -> SpannedValue
= v:raw_text() { SpannedValue::Text(v) } = v:raw_text() { SpannedValue::Text(v) }
@ -150,8 +148,8 @@ peg::parser!(pub grammar parse() for str {
rule inst_micros() -> DateTime<Utc> = rule inst_micros() -> DateTime<Utc> =
"#instmicros" whitespace()+ d:$( digit()+ ) { "#instmicros" whitespace()+ d:$( digit()+ ) {
let micros = d.parse::<i64>().unwrap(); let micros = d.parse::<i64>().unwrap();
let seconds: i64 = micros / 1000000; let seconds: i64 = micros / 1_000_000;
let nanos: u32 = ((micros % 1000000).abs() as u32) * 1000; let nanos: u32 = ((micros % 1_000_000).abs() as u32) * 1000;
Utc.timestamp(seconds, nanos) Utc.timestamp(seconds, nanos)
} }
@ -159,7 +157,7 @@ peg::parser!(pub grammar parse() for str {
"#instmillis" whitespace()+ d:$( digit()+ ) { "#instmillis" whitespace()+ d:$( digit()+ ) {
let millis = d.parse::<i64>().unwrap(); let millis = d.parse::<i64>().unwrap();
let seconds: i64 = millis / 1000; let seconds: i64 = millis / 1000;
let nanos: u32 = ((millis % 1000).abs() as u32) * 1000000; let nanos: u32 = ((millis % 1000).abs() as u32) * 1_000_000;
Utc.timestamp(seconds, nanos) Utc.timestamp(seconds, nanos)
} }
@ -351,7 +349,7 @@ peg::parser!(pub grammar parse() for str {
= __ "*" __ { query::PullAttributeSpec::Wildcard } = __ "*" __ { query::PullAttributeSpec::Wildcard }
/ __ k:raw_forward_namespaced_keyword() __ alias:(":as" __ alias:raw_forward_keyword() __ { alias })? { / __ k:raw_forward_namespaced_keyword() __ alias:(":as" __ alias:raw_forward_keyword() __ { alias })? {
let attribute = query::PullConcreteAttribute::Ident(::std::rc::Rc::new(k)); let attribute = query::PullConcreteAttribute::Ident(::std::rc::Rc::new(k));
let alias = alias.map(|alias| ::std::rc::Rc::new(alias)); let alias = alias.map(::std::rc::Rc::new);
query::PullAttributeSpec::Attribute( query::PullAttributeSpec::Attribute(
query::NamedPullAttribute { query::NamedPullAttribute {
attribute, attribute,
@ -525,4 +523,3 @@ peg::parser!(pub grammar parse() for str {
/ v:variable() { query::Binding::BindScalar(v) } / v:variable() { query::Binding::BindScalar(v) }
}); });


@ -85,7 +85,7 @@ impl NamespaceableName {
NamespaceableName { NamespaceableName {
components: dest, components: dest,
boundary: boundary, boundary,
} }
} }
@ -144,7 +144,7 @@ impl NamespaceableName {
} }
#[inline] #[inline]
pub fn components<'a>(&'a self) -> (&'a str, &'a str) { pub fn components(&self) -> (&str, &str) {
if self.boundary > 0 { if self.boundary > 0 {
( (
&self.components[0..self.boundary], &self.components[0..self.boundary],
@ -219,11 +219,11 @@ impl<'de> Deserialize<'de> for NamespaceableName {
D: Deserializer<'de>, D: Deserializer<'de>,
{ {
let separated = SerializedNamespaceableName::deserialize(deserializer)?; let separated = SerializedNamespaceableName::deserialize(deserializer)?;
if separated.name.len() == 0 { if separated.name.is_empty() {
return Err(de::Error::custom("Empty name in keyword or symbol")); return Err(de::Error::custom("Empty name in keyword or symbol"));
} }
if let Some(ns) = separated.namespace { if let Some(ns) = separated.namespace {
if ns.len() == 0 { if ns.is_empty() {
Err(de::Error::custom( Err(de::Error::custom(
"Empty but present namespace in keyword or symbol", "Empty but present namespace in keyword or symbol",
)) ))


@ -51,10 +51,6 @@ impl Variable {
self.0.as_ref().0.as_str() self.0.as_ref().0.as_str()
} }
pub fn to_string(&self) -> String {
self.0.as_ref().0.clone()
}
pub fn name(&self) -> PlainSymbol { pub fn name(&self) -> PlainSymbol {
self.0.as_ref().clone() self.0.as_ref().clone()
} }
@ -87,7 +83,7 @@ impl FromValue<Variable> for Variable {
impl Variable { impl Variable {
pub fn from_rc(sym: Rc<PlainSymbol>) -> Option<Variable> { pub fn from_rc(sym: Rc<PlainSymbol>) -> Option<Variable> {
if sym.is_var_symbol() { if sym.is_var_symbol() {
Some(Variable(sym.clone())) Some(Variable(sym))
} else { } else {
None None
} }
@ -246,18 +242,18 @@ impl FromValue<FnArg> for FnArg {
impl std::fmt::Display for FnArg { impl std::fmt::Display for FnArg {
fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result { fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result {
match self { match self {
&FnArg::Variable(ref var) => write!(f, "{}", var), FnArg::Variable(ref var) => write!(f, "{}", var),
&FnArg::SrcVar(ref var) => { FnArg::SrcVar(ref var) => {
if var == &SrcVar::DefaultSrc { if var == &SrcVar::DefaultSrc {
write!(f, "$") write!(f, "$")
} else { } else {
write!(f, "{:?}", var) write!(f, "{:?}", var)
} }
} }
&FnArg::EntidOrInteger(entid) => write!(f, "{}", entid), FnArg::EntidOrInteger(entid) => write!(f, "{}", entid),
&FnArg::IdentOrKeyword(ref kw) => write!(f, "{}", kw), FnArg::IdentOrKeyword(ref kw) => write!(f, "{}", kw),
&FnArg::Constant(ref constant) => write!(f, "{:?}", constant), FnArg::Constant(ref constant) => write!(f, "{:?}", constant),
&FnArg::Vector(ref vec) => write!(f, "{:?}", vec), FnArg::Vector(ref vec) => write!(f, "{:?}", vec),
} }
} }
} }
@ -265,7 +261,7 @@ impl std::fmt::Display for FnArg {
impl FnArg { impl FnArg {
pub fn as_variable(&self) -> Option<&Variable> { pub fn as_variable(&self) -> Option<&Variable> {
match self { match self {
&FnArg::Variable(ref v) => Some(v), FnArg::Variable(ref v) => Some(v),
_ => None, _ => None,
} }
} }
@ -332,12 +328,10 @@ impl FromValue<PatternNonValuePlace> for PatternNonValuePlace {
::SpannedValue::PlainSymbol(ref x) => { ::SpannedValue::PlainSymbol(ref x) => {
if x.0.as_str() == "_" { if x.0.as_str() == "_" {
Some(PatternNonValuePlace::Placeholder) Some(PatternNonValuePlace::Placeholder)
} else if let Some(v) = Variable::from_symbol(x) {
Some(PatternNonValuePlace::Variable(v))
} else { } else {
if let Some(v) = Variable::from_symbol(x) { None
Some(PatternNonValuePlace::Variable(v))
} else {
None
}
} }
} }
::SpannedValue::Keyword(ref x) => Some(x.clone().into()), ::SpannedValue::Keyword(ref x) => Some(x.clone().into()),
@ -404,9 +398,9 @@ impl FromValue<PatternValuePlace> for PatternValuePlace {
{ {
Some(PatternValuePlace::Constant(x.clone().into())) Some(PatternValuePlace::Constant(x.clone().into()))
} }
::SpannedValue::Uuid(ref u) => Some(PatternValuePlace::Constant( ::SpannedValue::Uuid(ref u) => {
NonIntegerConstant::Uuid(u.clone()), Some(PatternValuePlace::Constant(NonIntegerConstant::Uuid(*u)))
)), }
// These don't appear in queries. // These don't appear in queries.
::SpannedValue::Nil => None, ::SpannedValue::Nil => None,
@ -498,15 +492,15 @@ pub enum PullAttributeSpec {
impl std::fmt::Display for PullConcreteAttribute { impl std::fmt::Display for PullConcreteAttribute {
fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result { fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result {
match self { match self {
&PullConcreteAttribute::Ident(ref k) => write!(f, "{}", k), PullConcreteAttribute::Ident(ref k) => write!(f, "{}", k),
&PullConcreteAttribute::Entid(i) => write!(f, "{}", i), PullConcreteAttribute::Entid(i) => write!(f, "{}", i),
} }
} }
} }
impl std::fmt::Display for NamedPullAttribute { impl std::fmt::Display for NamedPullAttribute {
fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result { fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result {
if let &Some(ref alias) = &self.alias { if let Some(ref alias) = self.alias {
write!(f, "{} :as {}", self.attribute, alias) write!(f, "{} :as {}", self.attribute, alias)
} else { } else {
write!(f, "{}", self.attribute) write!(f, "{}", self.attribute)
@ -517,8 +511,8 @@ impl std::fmt::Display for NamedPullAttribute {
impl std::fmt::Display for PullAttributeSpec { impl std::fmt::Display for PullAttributeSpec {
fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result { fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result {
match self { match self {
&PullAttributeSpec::Wildcard => write!(f, "*"), PullAttributeSpec::Wildcard => write!(f, "*"),
&PullAttributeSpec::Attribute(ref attr) => write!(f, "{}", attr), PullAttributeSpec::Attribute(ref attr) => write!(f, "{}", attr),
} }
} }
} }
@ -553,10 +547,10 @@ impl Element {
/// Returns true if the element must yield only one value. /// Returns true if the element must yield only one value.
pub fn is_unit(&self) -> bool { pub fn is_unit(&self) -> bool {
match self { match self {
&Element::Variable(_) => false, Element::Variable(_) => false,
&Element::Pull(_) => false, Element::Pull(_) => false,
&Element::Aggregate(_) => true, Element::Aggregate(_) => true,
&Element::Corresponding(_) => true, Element::Corresponding(_) => true,
} }
} }
} }
@ -570,8 +564,8 @@ impl From<Variable> for Element {
impl std::fmt::Display for Element { impl std::fmt::Display for Element {
fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result { fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result {
match self { match self {
&Element::Variable(ref var) => write!(f, "{}", var), Element::Variable(ref var) => write!(f, "{}", var),
&Element::Pull(Pull { Element::Pull(Pull {
ref var, ref var,
ref patterns, ref patterns,
}) => { }) => {
@ -581,12 +575,12 @@ impl std::fmt::Display for Element {
} }
write!(f, "])") write!(f, "])")
} }
&Element::Aggregate(ref agg) => match agg.args.len() { Element::Aggregate(ref agg) => match agg.args.len() {
0 => write!(f, "({})", agg.func), 0 => write!(f, "({})", agg.func),
1 => write!(f, "({} {})", agg.func, agg.args[0]), 1 => write!(f, "({} {})", agg.func, agg.args[0]),
_ => write!(f, "({} {:?})", agg.func, agg.args), _ => write!(f, "({} {:?})", agg.func, agg.args),
}, },
&Element::Corresponding(ref var) => write!(f, "(the {})", var), Element::Corresponding(ref var) => write!(f, "(the {})", var),
} }
} }
} }
@ -609,20 +603,15 @@ pub enum Limit {
/// ///
/// ```rust /// ```rust
/// # use edn::query::{Element, FindSpec, Variable}; /// # use edn::query::{Element, FindSpec, Variable};
/// let elements = vec![
/// Element::Variable(Variable::from_valid_name("?foo")),
/// Element::Variable(Variable::from_valid_name("?bar")),
/// ];
/// let rel = FindSpec::FindRel(elements);
/// ///
/// # fn main() { /// if let FindSpec::FindRel(elements) = rel {
/// /// assert_eq!(2, elements.len());
/// let elements = vec![ /// }
/// Element::Variable(Variable::from_valid_name("?foo")),
/// Element::Variable(Variable::from_valid_name("?bar")),
/// ];
/// let rel = FindSpec::FindRel(elements);
///
/// if let FindSpec::FindRel(elements) = rel {
/// assert_eq!(2, elements.len());
/// }
///
/// # }
/// ``` /// ```
/// ///
#[derive(Clone, Debug, Eq, PartialEq)] #[derive(Clone, Debug, Eq, PartialEq)]
@ -649,19 +638,19 @@ impl FindSpec {
pub fn is_unit_limited(&self) -> bool { pub fn is_unit_limited(&self) -> bool {
use self::FindSpec::*; use self::FindSpec::*;
match self { match self {
&FindScalar(..) => true, FindScalar(..) => true,
&FindTuple(..) => true, FindTuple(..) => true,
&FindRel(..) => false, FindRel(..) => false,
&FindColl(..) => false, FindColl(..) => false,
} }
} }
pub fn expected_column_count(&self) -> usize { pub fn expected_column_count(&self) -> usize {
use self::FindSpec::*; use self::FindSpec::*;
match self { match self {
&FindScalar(..) => 1, FindScalar(..) => 1,
&FindColl(..) => 1, FindColl(..) => 1,
&FindTuple(ref elems) | &FindRel(ref elems) => elems.len(), FindTuple(ref elems) | FindRel(ref elems) => elems.len(),
} }
} }
@@ -690,10 +679,10 @@ impl FindSpec {
     pub fn columns<'s>(&'s self) -> Box<dyn Iterator<Item = &Element> + 's> {
         use self::FindSpec::*;
         match self {
-            &FindScalar(ref e) => Box::new(std::iter::once(e)),
-            &FindColl(ref e) => Box::new(std::iter::once(e)),
-            &FindTuple(ref v) => Box::new(v.iter()),
-            &FindRel(ref v) => Box::new(v.iter()),
+            FindScalar(ref e) => Box::new(std::iter::once(e)),
+            FindColl(ref e) => Box::new(std::iter::once(e)),
+            FindTuple(ref v) => Box::new(v.iter()),
+            FindRel(ref v) => Box::new(v.iter()),
         }
     }
 }
@@ -716,8 +705,8 @@ impl VariableOrPlaceholder {
     pub fn var(&self) -> Option<&Variable> {
         match self {
-            &VariableOrPlaceholder::Placeholder => None,
-            &VariableOrPlaceholder::Variable(ref var) => Some(var),
+            VariableOrPlaceholder::Placeholder => None,
+            VariableOrPlaceholder::Variable(ref var) => Some(var),
         }
     }
 }
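Most of the hunks in this commit apply the same mechanical fix: with Rust 2018 "match ergonomics" (default binding modes), matching on a `&self` reference no longer needs a leading `&` on each arm or explicit `ref` bindings. A minimal standalone sketch, using a hypothetical `Place` enum (not a type from this codebase) to show the before and after styles side by side:

```rust
// Hypothetical enum standing in for types like VariableOrPlaceholder.
enum Place {
    Placeholder,
    Variable(String),
}

impl Place {
    // Pre-2018 style: every arm needs `&` and `ref` to match on `&self`.
    fn var_old(&self) -> Option<&String> {
        match self {
            &Place::Placeholder => None,
            &Place::Variable(ref v) => Some(v),
        }
    }

    // Rust 2018 match ergonomics: `self` is a reference, so the arms match
    // by reference automatically and `v` is bound as `&String`.
    fn var_new(&self) -> Option<&String> {
        match self {
            Place::Placeholder => None,
            Place::Variable(v) => Some(v),
        }
    }
}

fn main() {
    let p = Place::Variable("?foo".to_string());
    // Both styles compile to the same behavior.
    assert_eq!(p.var_old(), p.var_new());
    assert_eq!(Place::Placeholder.var_new(), None);
}
```

This is why the diffs in this file are pure deletions of `&` and `ref` tokens: the binding modes are inferred, and clippy flags the explicit form as redundant.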
@@ -771,11 +760,11 @@ impl Binding {
     /// ```
     pub fn is_valid(&self) -> bool {
         match self {
-            &Binding::BindScalar(_) | &Binding::BindColl(_) => true,
-            &Binding::BindRel(ref vars) | &Binding::BindTuple(ref vars) => {
+            Binding::BindScalar(_) | Binding::BindColl(_) => true,
+            Binding::BindRel(ref vars) | Binding::BindTuple(ref vars) => {
                 let mut acc = HashSet::<Variable>::new();
                 for var in vars {
-                    if let &VariableOrPlaceholder::Variable(ref var) = var {
+                    if let VariableOrPlaceholder::Variable(ref var) = *var {
                         if !acc.insert(var.clone()) {
                             // It's invalid if there was an equal var already present in the set --
                             // i.e., we have a duplicate var.
@@ -832,7 +821,7 @@ impl Pattern {
                 entity: v_e,
                 attribute: k.to_reversed().into(),
                 value: e_v,
-                tx: tx,
+                tx,
             });
         } else {
             return None;
@@ -844,7 +833,7 @@ impl Pattern {
             entity: e,
             attribute: a,
             value: v,
-            tx: tx,
+            tx,
         })
     }
 }
@@ -894,7 +883,7 @@ pub enum UnifyVars {
 impl WhereClause {
     pub fn is_pattern(&self) -> bool {
         match self {
-            &WhereClause::Pattern(_) => true,
+            WhereClause::Pattern(_) => true,
             _ => false,
         }
     }
@@ -909,8 +898,8 @@ pub enum OrWhereClause {
 impl OrWhereClause {
     pub fn is_pattern_or_patterns(&self) -> bool {
         match self {
-            &OrWhereClause::Clause(WhereClause::Pattern(_)) => true,
-            &OrWhereClause::And(ref clauses) => clauses.iter().all(|clause| clause.is_pattern()),
+            OrWhereClause::Clause(WhereClause::Pattern(_)) => true,
+            OrWhereClause::And(ref clauses) => clauses.iter().all(|clause| clause.is_pattern()),
             _ => false,
         }
     }
@@ -934,8 +923,8 @@ pub struct NotJoin {
 impl NotJoin {
     pub fn new(unify_vars: UnifyVars, clauses: Vec<WhereClause>) -> NotJoin {
         NotJoin {
-            unify_vars: unify_vars,
-            clauses: clauses,
+            unify_vars,
+            clauses,
         }
     }
 }
@@ -1041,8 +1030,8 @@ impl ParsedQuery {
         Ok(ParsedQuery {
             find_spec: find_spec.ok_or("expected :find")?,
             default_source: SrcVar::DefaultSrc,
-            with: with.unwrap_or(vec![]),
-            in_vars: in_vars.unwrap_or(vec![]),
+            with: with.unwrap_or_else(|| vec![]),
+            in_vars: in_vars.unwrap_or_else(|| vec![]),
             in_sources: BTreeSet::default(),
             limit: limit.unwrap_or(Limit::None),
             where_clauses: where_clauses.ok_or("expected :where")?,
@@ -1054,8 +1043,8 @@ impl ParsedQuery {
 impl OrJoin {
     pub fn new(unify_vars: UnifyVars, clauses: Vec<OrWhereClause>) -> OrJoin {
         OrJoin {
-            unify_vars: unify_vars,
-            clauses: clauses,
+            unify_vars,
+            clauses,
             mentioned_vars: None,
         }
     }
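Two more lints recur in these constructor hunks: struct field init shorthand (writing `unify_vars,` rather than `unify_vars: unify_vars,` when the local and the field share a name) and `unwrap_or_else`, which defers constructing the default value until it is actually needed. A small sketch with made-up types (not the real `OrJoin`/`ParsedQuery`):

```rust
#[derive(Debug, PartialEq)]
struct OrJoinLike {
    unify_vars: bool,
    clauses: Vec<String>,
}

fn new_join(unify_vars: bool, clauses: Vec<String>) -> OrJoinLike {
    // Field init shorthand: no `unify_vars: unify_vars` repetition
    // (clippy: redundant_field_names).
    OrJoinLike { unify_vars, clauses }
}

fn main() {
    // `unwrap_or(vec![])` constructs its default eagerly even when the
    // Option is Some; the closure form only runs on the None path
    // (clippy: or_fun_call).
    let with: Option<Vec<String>> = None;
    let vars = with.unwrap_or_else(Vec::new);
    assert!(vars.is_empty());

    let j = new_join(true, vec!["[?x :foo/bar ?y]".to_string()]);
    assert_eq!(j.clauses.len(), 1);
}
```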
@@ -1064,8 +1053,8 @@ impl OrJoin {
     /// every variable mentioned inside the join is also mentioned in the `UnifyVars` list.
     pub fn is_fully_unified(&self) -> bool {
         match &self.unify_vars {
-            &UnifyVars::Implicit => true,
-            &UnifyVars::Explicit(ref vars) => {
+            UnifyVars::Implicit => true,
+            UnifyVars::Explicit(ref vars) => {
                 // We know that the join list must be a subset of the vars in the pattern, or
                 // it would have failed validation. That allows us to simply compare counts here.
                 // TODO: in debug mode, do a full intersection, and verify that our count check
@@ -1094,13 +1083,13 @@ impl ContainsVariables for WhereClause {
     fn accumulate_mentioned_variables(&self, acc: &mut BTreeSet<Variable>) {
         use self::WhereClause::*;
         match self {
-            &OrJoin(ref o) => o.accumulate_mentioned_variables(acc),
-            &Pred(ref p) => p.accumulate_mentioned_variables(acc),
-            &Pattern(ref p) => p.accumulate_mentioned_variables(acc),
-            &NotJoin(ref n) => n.accumulate_mentioned_variables(acc),
-            &WhereFn(ref f) => f.accumulate_mentioned_variables(acc),
-            &TypeAnnotation(ref a) => a.accumulate_mentioned_variables(acc),
-            &RuleExpr => (),
+            OrJoin(ref o) => o.accumulate_mentioned_variables(acc),
+            Pred(ref p) => p.accumulate_mentioned_variables(acc),
+            Pattern(ref p) => p.accumulate_mentioned_variables(acc),
+            NotJoin(ref n) => n.accumulate_mentioned_variables(acc),
+            WhereFn(ref f) => f.accumulate_mentioned_variables(acc),
+            TypeAnnotation(ref a) => a.accumulate_mentioned_variables(acc),
+            RuleExpr => (),
         }
     }
 }
@@ -1109,12 +1098,12 @@ impl ContainsVariables for OrWhereClause {
     fn accumulate_mentioned_variables(&self, acc: &mut BTreeSet<Variable>) {
         use self::OrWhereClause::*;
         match self {
-            &And(ref clauses) => {
+            And(ref clauses) => {
                 for clause in clauses {
                     clause.accumulate_mentioned_variables(acc)
                 }
             }
-            &Clause(ref clause) => clause.accumulate_mentioned_variables(acc),
+            Clause(ref clause) => clause.accumulate_mentioned_variables(acc),
         }
     }
 }
@@ -1161,7 +1150,7 @@ impl ContainsVariables for NotJoin {
 impl ContainsVariables for Predicate {
     fn accumulate_mentioned_variables(&self, acc: &mut BTreeSet<Variable>) {
         for arg in &self.args {
-            if let &FnArg::Variable(ref v) = arg {
+            if let FnArg::Variable(ref v) = *arg {
                 acc_ref(acc, v)
             }
         }
@@ -1177,10 +1166,10 @@ impl ContainsVariables for TypeAnnotation {
 impl ContainsVariables for Binding {
     fn accumulate_mentioned_variables(&self, acc: &mut BTreeSet<Variable>) {
         match self {
-            &Binding::BindScalar(ref v) | &Binding::BindColl(ref v) => acc_ref(acc, v),
-            &Binding::BindRel(ref vs) | &Binding::BindTuple(ref vs) => {
+            Binding::BindScalar(ref v) | Binding::BindColl(ref v) => acc_ref(acc, v),
+            Binding::BindRel(ref vs) | Binding::BindTuple(ref vs) => {
                 for v in vs {
-                    if let &VariableOrPlaceholder::Variable(ref v) = v {
+                    if let VariableOrPlaceholder::Variable(ref v) = *v {
                         acc_ref(acc, v);
                     }
                 }
@@ -1192,7 +1181,7 @@ impl ContainsVariables for Binding {
 impl ContainsVariables for WhereFn {
     fn accumulate_mentioned_variables(&self, acc: &mut BTreeSet<Variable>) {
         for arg in &self.args {
-            if let &FnArg::Variable(ref v) = arg {
+            if let FnArg::Variable(ref v) = *arg {
                 acc_ref(acc, v)
             }
         }


@@ -130,7 +130,7 @@ impl NamespacedSymbol {
     }
     #[inline]
-    pub fn components<'a>(&'a self) -> (&'a str, &'a str) {
+    pub fn components(&self) -> (&str, &str) {
         self.0.components()
     }
 }
@@ -180,7 +180,7 @@ impl Keyword {
     }
     #[inline]
-    pub fn components<'a>(&'a self) -> (&'a str, &'a str) {
+    pub fn components(&self) -> (&str, &str) {
         self.0.components()
     }


@@ -328,14 +328,14 @@ macro_rules! def_common_value_methods {
     pub fn is_keyword(&self) -> bool {
         match self {
-            &$t::Keyword(ref k) => !k.is_namespaced(),
+            $t::Keyword(ref k) => !k.is_namespaced(),
             _ => false,
         }
     }

     pub fn is_namespaced_keyword(&self) -> bool {
         match self {
-            &$t::Keyword(ref k) => k.is_namespaced(),
+            $t::Keyword(ref k) => k.is_namespaced(),
             _ => false,
         }
     }
@@ -360,21 +360,21 @@ macro_rules! def_common_value_methods {
     pub fn as_keyword(&self) -> Option<&symbols::Keyword> {
         match self {
-            &$t::Keyword(ref k) => Some(k),
+            $t::Keyword(ref k) => Some(k),
             _ => None,
         }
     }

     pub fn as_plain_keyword(&self) -> Option<&symbols::Keyword> {
         match self {
-            &$t::Keyword(ref k) if !k.is_namespaced() => Some(k),
+            $t::Keyword(ref k) if !k.is_namespaced() => Some(k),
             _ => None,
         }
     }

     pub fn as_namespaced_keyword(&self) -> Option<&symbols::Keyword> {
         match self {
-            &$t::Keyword(ref k) if k.is_namespaced() => Some(k),
+            $t::Keyword(ref k) if k.is_namespaced() => Some(k),
             _ => None,
         }
     }


@@ -22,7 +22,7 @@ where
     T: Sized + Clone,
 {
     fn from_rc(val: Rc<T>) -> Self {
-        val.clone()
+        val
     }

     fn from_arc(val: Arc<T>) -> Self {
@@ -45,7 +45,7 @@ where
     }

     fn from_arc(val: Arc<T>) -> Self {
-        val.clone()
+        val
     }
 }
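The change above is clippy's `redundant_clone`: when a function already owns the `Rc<T>`/`Arc<T>` it is asked to return, calling `.clone()` bumps the reference count only for the original to be dropped immediately afterwards. A standalone sketch of the same shape (using `Rc<String>` rather than the crate's generic `FromRc` trait):

```rust
use std::rc::Rc;

// Old style: `val` is already an owned Rc, so `.clone()` increments the
// refcount and then drops the original -- wasted work (clippy: redundant_clone).
fn from_rc_old(val: Rc<String>) -> Rc<String> {
    val.clone()
}

// Lint fix: move the owned Rc out unchanged.
fn from_rc_new(val: Rc<String>) -> Rc<String> {
    val
}

fn main() {
    let a = Rc::new("shared".to_string());
    // Both return a handle to the same value.
    assert_eq!(from_rc_old(a.clone()), from_rc_new(a.clone()));
    // After this, only `a` and `b` hold the allocation.
    let b = from_rc_new(a.clone());
    assert_eq!(Rc::strong_count(&b), 2);
}
```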


@@ -23,7 +23,11 @@ use num::traits::{One, Zero};
 use ordered_float::OrderedFloat;

 use chrono::{TimeZone, Utc};
-use edn::{parse, symbols, types::{Span, SpannedValue, Value, ValueAndSpan}, utils, ParseError};
+use edn::{
+    parse, symbols,
+    types::{Span, SpannedValue, Value, ValueAndSpan},
+    utils, ParseError,
+};

 // Helper for making wrapped keywords with a namespace.
 fn k_ns(ns: &str, name: &str) -> Value {

@@ -12,7 +12,7 @@ use std; // To refer to std::result::Result.
 use core_traits::{ValueType, ValueTypeSet};

-use edn::{ ParseError, query::PlainSymbol };
+use edn::{query::PlainSymbol, ParseError};

 pub type Result<T> = std::result::Result<T, AlgebrizerError>;


@@ -116,7 +116,7 @@ impl ConjoiningClauses {
         let constrained_types;
         if let Some(required) = self.required_types.get(var) {
-            constrained_types = known_types.intersection(required);
+            constrained_types = known_types.intersection(*required);
         } else {
             constrained_types = known_types;
         }


@@ -90,7 +90,7 @@ impl ConjoiningClauses {
         let mut args = where_fn.args.into_iter();

-        // TODO: process source variables.
+        // TODO(gburd): process source variables.
         match args.next().unwrap() {
             FnArg::SrcVar(SrcVar::DefaultSrc) => {}
             _ => bail!(AlgebrizerError::InvalidArgument(
@@ -104,7 +104,7 @@ impl ConjoiningClauses {
         // TODO: accept placeholder and set of attributes. Alternately, consider putting the search
         // term before the attribute arguments and collect the (variadic) attributes into a set.
         // let a: Entid = self.resolve_attribute_argument(&where_fn.operator, 1, args.next().unwrap())?;
         //
         // TODO: improve the expression of this matching, possibly by using attribute_for_* uniformly.
         let a = match args.next().unwrap() {
@@ -117,7 +117,7 @@ impl ConjoiningClauses {
                 match self.bound_value(&v) {
                     Some(TypedValue::Ref(entid)) => Some(entid),
                     Some(tv) => bail!(AlgebrizerError::InputTypeDisagreement(
-                        v.name().clone(),
+                        v.name(),
                         ValueType::Ref,
                         tv.value_type()
                     )),
@@ -130,20 +130,13 @@ impl ConjoiningClauses {
         // An unknown ident, or an entity that isn't present in the store, or isn't a fulltext
         // attribute, is likely enough to be a coding error that we choose to bail instead of
         // marking the pattern as known-empty.
-        let a = a.ok_or(AlgebrizerError::InvalidArgument(
-            where_fn.operator.clone(),
-            "attribute",
-            1,
-        ))?;
-        let attribute =
-            schema
-                .attribute_for_entid(a)
-                .cloned()
-                .ok_or(AlgebrizerError::InvalidArgument(
-                    where_fn.operator.clone(),
-                    "attribute",
-                    1,
-                ))?;
+        let op = where_fn.operator.clone(); //TODO(gburd): remove me...
+        let a = a.ok_or_else(move || AlgebrizerError::InvalidArgument(op, "attribute", 1))?;
+        let op = where_fn.operator.clone(); //TODO(gburd): remove me...
+        let attribute = schema
+            .attribute_for_entid(a)
+            .cloned()
+            .ok_or_else(move || AlgebrizerError::InvalidArgument(op, "attribute", 1))?;

         if !attribute.fulltext {
             // We can never get results from a non-fulltext attribute!
@@ -271,7 +264,7 @@ impl ConjoiningClauses {
             self.bind_column_to_var(
                 schema,
-                fulltext_values_alias.clone(),
+                fulltext_values_alias,
                 Column::Fulltext(FulltextColumn::Text),
                 var.clone(),
             );
@@ -284,12 +277,7 @@ impl ConjoiningClauses {
                 return Ok(());
             }

-            self.bind_column_to_var(
-                schema,
-                datoms_table_alias.clone(),
-                DatomsColumn::Tx,
-                var.clone(),
-            );
+            self.bind_column_to_var(schema, datoms_table_alias, DatomsColumn::Tx, var.clone());
         }

         if let VariableOrPlaceholder::Variable(ref var) = b_score {
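The `ok_or` to `ok_or_else` change in this file is clippy's `or_fun_call` lint applied to error construction: `ok_or(expr)` evaluates its argument eagerly, so the error value (here an `AlgebrizerError` carrying a cloned operator symbol) was built even on the success path, while `ok_or_else` takes a closure that only runs when the `Option` is `None`. A standalone sketch with a hypothetical error type (not the real `AlgebrizerError`):

```rust
// Hypothetical error type standing in for AlgebrizerError::InvalidArgument:
// (operator, argument name, argument position).
#[derive(Debug, PartialEq)]
struct InvalidArgument(String, &'static str, usize);

// The closure captures what it needs up front and is only invoked on the
// None path, so no error value is constructed when `a` is Some.
fn lookup(a: Option<i64>, operator: &str) -> Result<i64, InvalidArgument> {
    let op = operator.to_string();
    a.ok_or_else(move || InvalidArgument(op, "attribute", 1))
}

fn main() {
    assert_eq!(lookup(Some(42), ":fulltext"), Ok(42));
    assert_eq!(
        lookup(None, ":fulltext"),
        Err(InvalidArgument(":fulltext".to_string(), "attribute", 1))
    );
}
```

The `move` closure also explains the `let op = where_fn.operator.clone();` lines in the diff: the closure must own its captured data, so the operator is cloned once before each `ok_or_else` call.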


@@ -47,7 +47,7 @@ impl ConjoiningClauses {
         let named_values = ComputedTable::NamedValues {
             names: names.clone(),
-            values: values,
+            values,
         };
         let table = self.computed_tables.push_computed(named_values);
@@ -103,13 +103,13 @@ impl ConjoiningClauses {
             if existing != value {
                 self.mark_known_empty(EmptyBecause::ConflictingBindings {
                     var: var.clone(),
-                    existing: existing.clone(),
+                    existing,
                     desired: value,
                 });
                 return Ok(());
             }
         } else {
-            self.bind_value(&var, value.clone());
+            self.bind_value(&var, value);
         }

         Ok(())
@@ -180,7 +180,7 @@ impl ConjoiningClauses {
             .into_iter()
             .filter_map(|arg| -> Option<Result<TypedValue>> {
                 // We need to get conversion errors out.
-                // We also want to mark known-empty on impossibilty, but
+                // We also want to mark known-empty on impossibility, but
                 // still detect serious errors.
                 match self.typed_value_from_arg(schema, &var, arg, known_types) {
                     Ok(ValueConversion::Val(tv)) => {
@@ -188,7 +188,7 @@ impl ConjoiningClauses {
                         && !accumulated_types.is_unit()
                         {
                             // Values not all of the same type.
-                            Some(Err(AlgebrizerError::InvalidGroundConstant.into()))
+                            Some(Err(AlgebrizerError::InvalidGroundConstant))
                         } else {
                             Some(Ok(tv))
                         }
@@ -198,7 +198,7 @@ impl ConjoiningClauses {
                         skip = Some(because);
                         None
                     }
-                    Err(e) => Some(Err(e.into())),
+                    Err(e) => Some(Err(e)),
                 }
             })
             .collect::<Result<Vec<TypedValue>>>()?;
@@ -211,7 +211,7 @@ impl ConjoiningClauses {
         // Otherwise, we now have the values and the type.
         let types = vec![accumulated_types.exemplar().unwrap()];
-        let names = vec![var.clone()];
+        let names = vec![var];
         self.collect_named_bindings(schema, names, types, values);

         Ok(())
@@ -227,8 +227,8 @@ impl ConjoiningClauses {
         let template: Vec<Option<(Variable, ValueTypeSet)>> = places
             .iter()
             .map(|x| match x {
-                &VariableOrPlaceholder::Placeholder => None,
-                &VariableOrPlaceholder::Variable(ref v) => {
+                VariableOrPlaceholder::Placeholder => None,
+                VariableOrPlaceholder::Variable(ref v) => {
                     Some((v.clone(), self.known_type_set(v)))
                 }
             })
@@ -271,7 +271,7 @@ impl ConjoiningClauses {
                 // Convert each item in the row.
                 // If any value in the row is impossible, then skip the row.
                 // If all rows are impossible, fail the entire CC.
-                if let &Some(ref pair) = pair {
+                if let Some(ref pair) = pair {
                     match self.typed_value_from_arg(schema, &pair.0, col, pair.1)? {
                         ValueConversion::Val(tv) => vals.push(tv),
                         ValueConversion::Impossible(because) => {


@@ -55,7 +55,7 @@ impl QueryInputs {
                 .iter()
                 .map(|(var, val)| (var.clone(), val.value_type()))
                 .collect(),
-            values: values,
+            values,
         }
     }
@@ -73,9 +73,6 @@ impl QueryInputs {
                 }
             }
         }
-        Ok(QueryInputs {
-            types: types,
-            values: values,
-        })
+        Ok(QueryInputs { types, values })
     }
 }


@@ -147,8 +147,8 @@ pub struct ConjoiningClauses {
     /// A map from var to qualified columns. Used to project.
     pub column_bindings: BTreeMap<Variable, Vec<QualifiedAlias>>,

-    /// A list of variables mentioned in the enclosing query's :in clause. These must all be bound
-    /// before the query can be executed. TODO: clarify what this means for nested CCs.
+    /// A list of variables mentioned in the enclosing query's `:in` clause, all of which must be
+    /// bound before the query can be executed. TODO: clarify what this means for nested CCs.
     pub input_variables: BTreeSet<Variable>,

     /// In some situations -- e.g., when a query is being run only once -- we know in advance the
@@ -279,7 +279,7 @@ impl ConjoiningClauses {
         values.keep_intersected_keys(&in_variables);

         let mut cc = ConjoiningClauses {
-            alias_counter: alias_counter,
+            alias_counter,
             input_variables: in_variables,
             value_bindings: values,
             ..Default::default()
impl ConjoiningClauses { impl ConjoiningClauses {
pub(crate) fn derive_types_from_find_spec(&mut self, find_spec: &FindSpec) { pub(crate) fn derive_types_from_find_spec(&mut self, find_spec: &FindSpec) {
for spec in find_spec.columns() { for spec in find_spec.columns() {
match spec { if let Element::Pull(Pull { ref var, .. }) = spec {
&Element::Pull(Pull { self.constrain_var_to_type(var.clone(), ValueType::Ref);
ref var,
patterns: _,
}) => {
self.constrain_var_to_type(var.clone(), ValueType::Ref);
}
_ => {}
} }
} }
} }
@ -410,7 +404,7 @@ impl ConjoiningClauses {
self.known_types self.known_types
.get(var) .get(var)
.cloned() .cloned()
.unwrap_or(ValueTypeSet::any()) .unwrap_or_else(ValueTypeSet::any)
} }
pub(crate) fn bind_column_to_var<C: Into<Column>>( pub(crate) fn bind_column_to_var<C: Into<Column>>(
@ -514,7 +508,7 @@ impl ConjoiningClauses {
self.column_bindings self.column_bindings
.entry(var) .entry(var)
.or_insert(vec![]) .or_insert_with(|| vec![])
.push(alias); .push(alias);
} }
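The `or_insert(vec![])` to `or_insert_with` change above is the same eager-vs-lazy theme applied to map entries: `or_insert(vec![])` constructs the vector on every call, even when the key is already occupied, while `or_insert_with` defers construction to the vacant case. A self-contained sketch (with `String` keys standing in for the crate's `Variable` and alias types):

```rust
use std::collections::BTreeMap;

// `or_insert_with(Vec::new)` only builds the empty Vec when the key is
// vacant; `or_insert(vec![])` would construct (and usually discard) one
// per call (clippy: or_fun_call).
fn bind_column(bindings: &mut BTreeMap<String, Vec<String>>, var: &str, alias: &str) {
    bindings
        .entry(var.to_string())
        .or_insert_with(Vec::new)
        .push(alias.to_string());
}

fn main() {
    let mut column_bindings = BTreeMap::new();
    bind_column(&mut column_bindings, "?x", "datoms00.e");
    bind_column(&mut column_bindings, "?x", "datoms01.e");
    // Both aliases accumulate under the same variable.
    assert_eq!(column_bindings["?x"].len(), 2);
}
```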
@@ -585,10 +579,10 @@ impl ConjoiningClauses {
         these_types: ValueTypeSet,
     ) -> Option<EmptyBecause> {
         if let Some(existing) = self.known_types.get(var) {
-            if existing.intersection(&these_types).is_empty() {
+            if existing.intersection(these_types).is_empty() {
                 return Some(EmptyBecause::TypeMismatch {
                     var: var.clone(),
-                    existing: existing.clone(),
+                    existing: *existing,
                     desired: these_types,
                 });
             }
@@ -640,7 +634,7 @@ impl ConjoiningClauses {
                 // We have an existing requirement. The new requirement will be
                 // the intersection, but we'll `mark_known_empty` if that's empty.
                 let existing = *entry.get();
-                let intersection = types.intersection(&existing);
+                let intersection = types.intersection(existing);
                 entry.insert(intersection);

                 if !intersection.is_empty() {
@@ -648,8 +642,8 @@ impl ConjoiningClauses {
                 }

                 EmptyBecause::TypeMismatch {
-                    var: var,
-                    existing: existing,
+                    var,
+                    existing,
                     desired: types,
                 }
             }
@@ -684,7 +678,7 @@ impl ConjoiningClauses {
                     panic!("Uh oh: we failed this pattern, probably because {:?} couldn't match, but now we're broadening its type.",
                            e.key());
                 }
-                new = existing_types.union(&new_types);
+                new = existing_types.union(new_types);
             }
             e.insert(new);
         }
@@ -710,11 +704,11 @@ impl ConjoiningClauses {
                 e.insert(types);
             }
             Entry::Occupied(mut e) => {
-                let intersected: ValueTypeSet = types.intersection(e.get());
+                let intersected: ValueTypeSet = types.intersection(*e.get());
                 if intersected.is_empty() {
                     let reason = EmptyBecause::TypeMismatch {
                         var: e.key().clone(),
-                        existing: e.get().clone(),
+                        existing: *e.get(),
                         desired: types,
                     };
                     empty_because = Some(reason);
@@ -751,7 +745,7 @@ impl ConjoiningClauses {
         // If it's a variable, record that it has the right type.
         // Ident or attribute resolution errors (the only other check we need to do) will be done
         // by the caller.
-        if let &EvolvedNonValuePlace::Variable(ref v) = value {
+        if let EvolvedNonValuePlace::Variable(ref v) = value {
             self.constrain_var_to_type(v.clone(), ValueType::Ref)
         }
     }
@@ -784,12 +778,12 @@ impl ConjoiningClauses {
     ) -> ::std::result::Result<DatomsTable, EmptyBecause> {
         if attribute.fulltext {
             match value {
-                &EvolvedValuePlace::Placeholder => Ok(DatomsTable::Datoms), // We don't need the value.
+                EvolvedValuePlace::Placeholder => Ok(DatomsTable::Datoms), // We don't need the value.

                 // TODO: an existing non-string binding can cause this pattern to fail.
-                &EvolvedValuePlace::Variable(_) => Ok(DatomsTable::FulltextDatoms),
-                &EvolvedValuePlace::Value(TypedValue::String(_)) => Ok(DatomsTable::FulltextDatoms),
+                EvolvedValuePlace::Variable(_) => Ok(DatomsTable::FulltextDatoms),
+                EvolvedValuePlace::Value(TypedValue::String(_)) => Ok(DatomsTable::FulltextDatoms),

                 _ => {
                     // We can't succeed if there's a non-string constant value for a fulltext
@@ -802,9 +796,9 @@ impl ConjoiningClauses {
         }
     }

-    fn table_for_unknown_attribute<'s, 'a>(
+    fn table_for_unknown_attribute(
         &self,
-        value: &'a EvolvedValuePlace,
+        value: &EvolvedValuePlace,
     ) -> ::std::result::Result<DatomsTable, EmptyBecause> {
         // If the value is known to be non-textual, we can simply use the regular datoms
         // table (TODO: and exclude on `index_fulltext`!).
@@ -817,7 +811,7 @@ impl ConjoiningClauses {
         Ok(match value {
             // TODO: see if the variable is projected, aggregated, or compared elsewhere in
             // the query. If it's not, we don't need to use all_datoms here.
-            &EvolvedValuePlace::Variable(ref v) => {
+            EvolvedValuePlace::Variable(ref v) => {
                 // If `required_types` and `known_types` don't exclude strings,
                 // we need to query `all_datoms`.
                 if self
@@ -834,7 +828,7 @@ impl ConjoiningClauses {
                     DatomsTable::Datoms
                 }
             }
-            &EvolvedValuePlace::Value(TypedValue::String(_)) => DatomsTable::AllDatoms,
+            EvolvedValuePlace::Value(TypedValue::String(_)) => DatomsTable::AllDatoms,
             _ => DatomsTable::Datoms,
         })
     }
@@ -850,14 +844,14 @@ impl ConjoiningClauses {
         value: &'a EvolvedValuePlace,
     ) -> ::std::result::Result<DatomsTable, EmptyBecause> {
         match attribute {
-            &EvolvedNonValuePlace::Entid(id) => schema
-                .attribute_for_entid(id)
-                .ok_or_else(|| EmptyBecause::InvalidAttributeEntid(id))
+            EvolvedNonValuePlace::Entid(id) => schema
+                .attribute_for_entid(*id)
+                .ok_or_else(|| EmptyBecause::InvalidAttributeEntid(*id))
                 .and_then(|attribute| self.table_for_attribute_and_value(attribute, value)),
             // TODO: In a prepared context, defer this decision until a second algebrizing phase.
             // #278.
-            &EvolvedNonValuePlace::Placeholder => self.table_for_unknown_attribute(value),
-            &EvolvedNonValuePlace::Variable(ref v) => {
+            EvolvedNonValuePlace::Placeholder => self.table_for_unknown_attribute(value),
+            EvolvedNonValuePlace::Variable(ref v) => {
                 // See if we have a binding for the variable.
                 match self.bound_value(v) {
                     // TODO: In a prepared context, defer this decision until a second algebrizing phase.
@@ -883,7 +877,7 @@ impl ConjoiningClauses {
                         // attribute place.
                         Err(EmptyBecause::InvalidBinding(
                             Column::Fixed(DatomsColumn::Attribute),
-                            v.clone(),
+                            v,
                         ))
                     }
                 }
@@ -922,8 +916,8 @@ impl ConjoiningClauses {
     ) -> Option<&'s Attribute> {
         match value {
             // We know this one is known if the attribute lookup succeeds…
-            &TypedValue::Ref(id) => schema.attribute_for_entid(id),
-            &TypedValue::Keyword(ref kw) => schema.attribute_for_ident(kw).map(|(a, _id)| a),
+            TypedValue::Ref(id) => schema.attribute_for_entid(*id),
+            TypedValue::Keyword(ref kw) => schema.attribute_for_ident(kw).map(|(a, _id)| a),
             _ => None,
         }
     }
@@ -981,7 +975,7 @@ impl ConjoiningClauses {
     pub(crate) fn expand_column_bindings(&mut self) {
         for cols in self.column_bindings.values() {
             if cols.len() > 1 {
-                let ref primary = cols[0];
+                let primary = &cols[0];
                 let secondaries = cols.iter().skip(1);
                 for secondary in secondaries {
                     // TODO: if both primary and secondary are .v, should we make sure
@@ -1029,18 +1023,18 @@ impl ConjoiningClauses {
         let mut empty_because: Option<EmptyBecause> = None;
         for (var, types) in self.required_types.clone().into_iter() {
             if let Some(already_known) = self.known_types.get(&var) {
-                if already_known.is_disjoint(&types) {
+                if already_known.is_disjoint(types) {
                     // If we know the constraint can't be one of the types
                     // the variable could take, then we know we're empty.
                     empty_because = Some(EmptyBecause::TypeMismatch {
-                        var: var,
+                        var,
                         existing: *already_known,
                         desired: types,
                     });
                     break;
                 }

-                if already_known.is_subset(&types) {
+                if already_known.is_subset(types) {
                     // TODO: I'm not convinced that we can do nothing here.
                     //
                     // Consider `[:find ?x ?v :where [_ _ ?v] [(> ?v 10)] [?x :foo/long ?v]]`.
@ -1129,7 +1123,7 @@ impl ConjoiningClauses {
} }
fn mark_as_ref(&mut self, pos: &PatternNonValuePlace) { fn mark_as_ref(&mut self, pos: &PatternNonValuePlace) {
if let &PatternNonValuePlace::Variable(ref var) = pos { if let PatternNonValuePlace::Variable(ref var) = pos {
self.constrain_var_to_type(var.clone(), ValueType::Ref) self.constrain_var_to_type(var.clone(), ValueType::Ref)
} }
} }
@ -1142,13 +1136,13 @@ impl ConjoiningClauses {
// We apply (top level) type predicates first as an optimization. // We apply (top level) type predicates first as an optimization.
for clause in where_clauses.iter() { for clause in where_clauses.iter() {
match clause { match clause {
&WhereClause::TypeAnnotation(ref anno) => { WhereClause::TypeAnnotation(ref anno) => {
self.apply_type_anno(anno)?; self.apply_type_anno(anno)?;
} }
// Patterns are common, so let's grab as much type information from // Patterns are common, so let's grab as much type information from
// them as we can. // them as we can.
&WhereClause::Pattern(ref p) => { WhereClause::Pattern(ref p) => {
self.mark_as_ref(&p.entity); self.mark_as_ref(&p.entity);
self.mark_as_ref(&p.attribute); self.mark_as_ref(&p.attribute);
self.mark_as_ref(&p.tx); self.mark_as_ref(&p.tx);
@ -1167,7 +1161,7 @@ impl ConjoiningClauses {
let mut patterns: VecDeque<EvolvedPattern> = VecDeque::with_capacity(remaining); let mut patterns: VecDeque<EvolvedPattern> = VecDeque::with_capacity(remaining);
for clause in where_clauses { for clause in where_clauses {
remaining -= 1; remaining -= 1;
if let &WhereClause::TypeAnnotation(_) = &clause { if let WhereClause::TypeAnnotation(_) = &clause {
continue; continue;
} }
match clause { match clause {
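The hunks above repeatedly drop `&` and keep `ref` in match arms (`&WhereClause::Pattern(ref p)` becomes `WhereClause::Pattern(ref p)`). Since Rust 2018's match ergonomics, matching a reference against a variant pattern binds the fields by reference automatically, so the explicit `&` is redundant; clippy flags the old style as `clippy::match_ref_pats`. A minimal sketch with a hypothetical stand-in enum:

```rust
// Hypothetical enum standing in for WhereClause.
enum Clause {
    Pattern(String),
    TypeAnnotation(String),
}

// Old style: match a `&Clause` with `&`-prefixed patterns (clippy::match_ref_pats).
fn describe_old(clause: &Clause) -> &'static str {
    match clause {
        &Clause::Pattern(ref _p) => "pattern",
        &Clause::TypeAnnotation(ref _a) => "annotation",
    }
}

// New style: match ergonomics dereferences for us; bindings become references.
fn describe_new(clause: &Clause) -> &'static str {
    match clause {
        Clause::Pattern(_p) => "pattern",
        Clause::TypeAnnotation(_a) => "annotation",
    }
}

fn main() {
    let c = Clause::Pattern("?x".to_string());
    assert_eq!(describe_old(&c), describe_new(&c));
}
```

Both forms compile to the same code; the lint is purely about readability.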

View file

@@ -642,7 +642,7 @@ impl ConjoiningClauses {
 // For any variable which has an imprecise type anywhere in the UNION, add it to the
 // set that needs type extraction. All UNION arms must project the same columns.
 for var in projection.iter() {
-if acc.iter().any(|cc| !cc.known_type(var).is_some()) {
+if acc.iter().any(|cc| cc.known_type(var).is_none()) {
 type_needed.insert(var.clone());
 }
 }
@@ -672,7 +672,7 @@ impl ConjoiningClauses {
 }
 let union = ComputedTable::Union {
-projection: projection,
+projection,
 type_extraction: type_needed,
 arms: acc,
 };
@@ -730,7 +730,7 @@ fn union_types(
 e.insert(new_types.clone());
 }
 Entry::Occupied(mut e) => {
-let new = e.get().union(&new_types);
+let new = e.get().union(*new_types);
 e.insert(new);
 }
 }
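The `!cc.known_type(var).is_some()` rewrite above is the usual clippy fix for a negated `is_some()`: `Option::is_none` states the intent directly. A small sketch:

```rust
// `!opt.is_some()` and `opt.is_none()` are equivalent; the latter is what
// clippy suggests and reads without mental negation.
fn needs_type_extraction(known: &[Option<u32>]) -> bool {
    known.iter().any(|k| k.is_none())
}

fn main() {
    assert!(needs_type_extraction(&[Some(1), None]));
    assert!(!needs_type_extraction(&[Some(1), Some(2)]));
}
```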

View file

@@ -8,6 +8,8 @@
 // CONDITIONS OF ANY KIND, either express or implied. See the License for the
 // specific language governing permissions and limitations under the License.
+#![allow(clippy::single_match)]
 use core_traits::{Entid, TypedValue, ValueType, ValueTypeSet};
 use mentat_core::{Cloned, HasSchema};
@@ -27,7 +29,7 @@ use Known;
 pub fn into_typed_value(nic: NonIntegerConstant) -> TypedValue {
 match nic {
-NonIntegerConstant::BigInteger(_) => unimplemented!(), // TODO: #280.
+NonIntegerConstant::BigInteger(_) => unimplemented!(), // TODO(gburd): #280.
 NonIntegerConstant::Boolean(v) => TypedValue::Boolean(v),
 NonIntegerConstant::Float(v) => TypedValue::Double(v),
 NonIntegerConstant::Text(v) => v.into(),
@@ -93,17 +95,15 @@ impl ConjoiningClauses {
 self.constrain_to_ref(&pattern.entity);
 self.constrain_to_ref(&pattern.attribute);
-let ref col = alias.1;
+let col = &alias.1;
 let schema = known.schema;
 match pattern.entity {
 EvolvedNonValuePlace::Placeholder =>
 // Placeholders don't contribute any column bindings, nor do
 // they constrain the query -- there's no need to produce
 // IS NOT NULL, because we don't store nulls in our schema.
-{
-()
-}
+{}
 EvolvedNonValuePlace::Variable(ref v) => {
 self.bind_column_to_var(schema, col.clone(), DatomsColumn::Entity, v.clone())
 }
@@ -287,7 +287,7 @@ impl ConjoiningClauses {
 None => {
 self.mark_known_empty(EmptyBecause::CachedAttributeHasNoEntity {
 value: val.clone(),
-attr: attr,
+attr,
 });
 true
 }
@@ -301,7 +301,7 @@ impl ConjoiningClauses {
 None => {
 self.mark_known_empty(EmptyBecause::CachedAttributeHasNoEntity {
 value: val.clone(),
-attr: attr,
+attr,
 });
 true
 }
@@ -403,8 +403,8 @@ impl ConjoiningClauses {
 None => {
 self.mark_known_empty(
 EmptyBecause::CachedAttributeHasNoValues {
-entity: entity,
+entity,
-attr: attr,
+attr,
 },
 );
 return true;
@@ -416,7 +416,7 @@ impl ConjoiningClauses {
 }
 }
 }
-_ => {} // TODO: check constant values against cache.
+_ => {} // TODO: check constant values against the cache.
 }
 }
 _ => {}
@@ -591,7 +591,7 @@ impl ConjoiningClauses {
 entity: e,
 attribute: a,
 value: v,
-tx: tx,
+tx,
 }),
 },
 },
@@ -612,7 +612,7 @@ impl ConjoiningClauses {
 let mut new_value: Option<EvolvedValuePlace> = None;
 match &pattern.entity {
-&EvolvedNonValuePlace::Variable(ref var) => {
+EvolvedNonValuePlace::Variable(ref var) => {
 // See if we have it yet!
 match self.bound_value(&var) {
 None => (),
@@ -631,12 +631,12 @@ impl ConjoiningClauses {
 _ => (),
 }
 match &pattern.value {
-&EvolvedValuePlace::Variable(ref var) => {
+EvolvedValuePlace::Variable(ref var) => {
 // See if we have it yet!
 match self.bound_value(&var) {
 None => (),
 Some(tv) => {
-new_value = Some(EvolvedValuePlace::Value(tv.clone()));
+new_value = Some(EvolvedValuePlace::Value(tv));
 }
 };
 }
@@ -679,7 +679,6 @@ impl ConjoiningClauses {
 // between an attribute and a value.
 // We know we cannot return a result, so we short-circuit here.
 self.mark_known_empty(EmptyBecause::AttributeLookupFailed);
-return;
 }
 }
 }
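Two lints dominate the hunks above: `var: var` collapsing to `var` is `clippy::redundant_field_names` (field init shorthand), and the dropped trailing `return;` is `clippy::needless_return`. A minimal sketch with hypothetical types:

```rust
// Hypothetical struct standing in for EmptyBecause::TypeMismatch's fields.
struct Mismatch {
    var: String,
    desired: u32,
}

// When a local variable has the same name as the field, the shorthand
// `Mismatch { var, desired }` replaces `Mismatch { var: var, desired: desired }`.
fn mismatch(var: String, desired: u32) -> Mismatch {
    Mismatch { var, desired }
}

// A bare `return;` as the last statement of a `()`-returning function is
// redundant (clippy::needless_return); falling off the end does the same thing.
fn mark_empty(flag: &mut bool) {
    *flag = true;
}

fn main() {
    let m = mismatch("?v".to_string(), 5);
    assert_eq!(m.var, "?v");
    assert_eq!(m.desired, 5);
    let mut empty = false;
    mark_empty(&mut empty);
    assert!(empty);
}
```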
} }

View file

@@ -44,7 +44,7 @@ impl ConjoiningClauses {
 fn potential_types(&self, schema: &Schema, fn_arg: &FnArg) -> Result<ValueTypeSet> {
 match fn_arg {
-&FnArg::Variable(ref v) => Ok(self.known_type_set(v)),
+FnArg::Variable(ref v) => Ok(self.known_type_set(v)),
 _ => fn_arg.potential_types(schema),
 }
 }
@@ -95,7 +95,7 @@ impl ConjoiningClauses {
 let supported_types = comparison.supported_types();
 let mut left_types = self
 .potential_types(known.schema, &left)?
-.intersection(&supported_types);
+.intersection(supported_types);
 if left_types.is_empty() {
 bail!(AlgebrizerError::InvalidArgumentType(
 predicate.operator.clone(),
@@ -106,7 +106,7 @@ impl ConjoiningClauses {
 let mut right_types = self
 .potential_types(known.schema, &right)?
-.intersection(&supported_types);
+.intersection(supported_types);
 if right_types.is_empty() {
 bail!(AlgebrizerError::InvalidArgumentType(
 predicate.operator.clone(),
@@ -125,7 +125,7 @@ impl ConjoiningClauses {
 left_types.insert(ValueType::Double);
 }
-let shared_types = left_types.intersection(&right_types);
+let shared_types = left_types.intersection(right_types);
 if shared_types.is_empty() {
 // In isolation these are both valid inputs to the operator, but the query cannot
 // succeed because the types don't match.
@@ -176,8 +176,8 @@ impl ConjoiningClauses {
 }
 impl Inequality {
-fn to_constraint(&self, left: QueryValue, right: QueryValue) -> ColumnConstraint {
+fn to_constraint(self, left: QueryValue, right: QueryValue) -> ColumnConstraint {
-match *self {
+match self {
 Inequality::TxAfter | Inequality::TxBefore => {
 // TODO: both ends of the range must be inside the tx partition!
 // If we know the partition map -- and at this point we do, it's just
@@ -188,9 +188,9 @@ impl Inequality {
 }
 ColumnConstraint::Inequality {
-operator: *self,
+operator: self,
-left: left,
+left,
-right: right,
+right,
 }
 }
 }
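The `to_constraint(&self)` to `to_constraint(self)` change (and `match *self` to `match self`) follows clippy's `trivially_copy_pass_by_ref`: for a small `Copy` enum, passing by value is at least as cheap as passing a reference, and the body no longer needs dereferences. A sketch with a hypothetical two-variant enum:

```rust
// A small Copy enum: its discriminant fits in one byte, so copying it is
// cheaper than (or equal to) passing a pointer to it.
#[derive(Clone, Copy, PartialEq, Debug)]
enum Inequality {
    LessThan,
    GreaterThan,
}

impl Inequality {
    // Taking `self` by value (clippy::trivially_copy_pass_by_ref):
    // `match self` needs no `*`, and callers keep their copy.
    fn flip(self) -> Inequality {
        match self {
            Inequality::LessThan => Inequality::GreaterThan,
            Inequality::GreaterThan => Inequality::LessThan,
        }
    }
}

fn main() {
    let op = Inequality::LessThan;
    assert_eq!(op.flip(), Inequality::GreaterThan);
    // `op` is still usable: it was copied into `flip`, not moved.
    assert_eq!(op.flip().flip(), op);
}
```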

View file

@@ -41,14 +41,14 @@ impl ConjoiningClauses {
 if v.value_type().is_numeric() {
 Ok(QueryValue::TypedValue(v))
 } else {
-bail!(AlgebrizerError::InputTypeDisagreement(var.name().clone(), ValueType::Long, v.value_type()))
+bail!(AlgebrizerError::InputTypeDisagreement(var.name(), ValueType::Long, v.value_type()))
 }
 } else {
 self.constrain_var_to_numeric(var.clone());
 self.column_bindings
 .get(&var)
 .and_then(|cols| cols.first().map(|col| QueryValue::Column(col.clone())))
-.ok_or_else(|| AlgebrizerError::UnboundVariable(var.name()).into())
+.ok_or_else(|| AlgebrizerError::UnboundVariable(var.name()))
 }
 },
 // Can't be an entid.
@@ -80,7 +80,7 @@ impl ConjoiningClauses {
 FnArg::Variable(var) => match self.bound_value(&var) {
 Some(TypedValue::Instant(v)) => Ok(QueryValue::TypedValue(TypedValue::Instant(v))),
 Some(v) => bail!(AlgebrizerError::InputTypeDisagreement(
-var.name().clone(),
+var.name(),
 ValueType::Instant,
 v.value_type()
 )),
@@ -89,7 +89,7 @@ impl ConjoiningClauses {
 self.column_bindings
 .get(&var)
 .and_then(|cols| cols.first().map(|col| QueryValue::Column(col.clone())))
-.ok_or_else(|| AlgebrizerError::UnboundVariable(var.name()).into())
+.ok_or_else(|| AlgebrizerError::UnboundVariable(var.name()))
 }
 },
 Constant(NonIntegerConstant::Instant(v)) => {
@@ -136,14 +136,14 @@ impl ConjoiningClauses {
 self.column_bindings
 .get(&var)
 .and_then(|cols| cols.first().map(|col| QueryValue::Column(col.clone())))
-.ok_or_else(|| AlgebrizerError::UnboundVariable(var.name()).into())
+.ok_or_else(|| AlgebrizerError::UnboundVariable(var.name()))
 }
 }
 EntidOrInteger(i) => Ok(QueryValue::TypedValue(TypedValue::Ref(i))),
 IdentOrKeyword(i) => schema
 .get_entid(&i)
 .map(|known_entid| QueryValue::Entid(known_entid.into()))
-.ok_or_else(|| AlgebrizerError::UnrecognizedIdent(i.to_string()).into()),
+.ok_or_else(|| AlgebrizerError::UnrecognizedIdent(i.to_string())),
 Constant(NonIntegerConstant::Boolean(_))
 | Constant(NonIntegerConstant::Float(_))
 | Constant(NonIntegerConstant::Text(_))
@@ -188,7 +188,7 @@ impl ConjoiningClauses {
 .column_bindings
 .get(&var)
 .and_then(|cols| cols.first().map(|col| QueryValue::Column(col.clone())))
-.ok_or_else(|| AlgebrizerError::UnboundVariable(var.name()).into()),
+.ok_or_else(|| AlgebrizerError::UnboundVariable(var.name())),
 },
 EntidOrInteger(i) => Ok(QueryValue::PrimitiveLong(i)),
 IdentOrKeyword(_) => unimplemented!(), // TODO

View file

@@ -122,7 +122,7 @@ impl ConjoiningClauses {
 known.schema,
 transactions.clone(),
 TransactionsColumn::Tx,
-tx_var.clone(),
+tx_var,
 );
 let after_constraint = ColumnConstraint::Inequality {
@@ -138,7 +138,7 @@ impl ConjoiningClauses {
 let before_constraint = ColumnConstraint::Inequality {
 operator: Inequality::LessThan,
 left: QueryValue::Column(QualifiedAlias(
-transactions.clone(),
+transactions,
 Column::Transactions(TransactionsColumn::Tx),
 )),
 right: tx2,
@@ -306,7 +306,7 @@ impl ConjoiningClauses {
 self.bind_column_to_var(
 known.schema,
-transactions.clone(),
+transactions,
 TransactionsColumn::Added,
 var.clone(),
 );

View file

@@ -312,7 +312,7 @@ pub fn algebrize_with_inputs(
 cc.derive_types_from_find_spec(&parsed.find_spec);
 // Do we have a variable limit? If so, tell the CC that the var must be numeric.
-if let &Limit::Variable(ref var) = &parsed.limit {
+if let Limit::Variable(ref var) = parsed.limit {
 cc.constrain_var_to_long(var.clone());
 }
@@ -338,9 +338,9 @@ pub fn algebrize_with_inputs(
 has_aggregates: false, // TODO: we don't parse them yet.
 with: parsed.with,
 named_projection: extra_vars,
-order: order,
+order,
-limit: limit,
+limit,
-cc: cc,
+cc,
 };
 // Substitute in any fixed values and fail if they're out of range.
@@ -364,7 +364,7 @@ impl FindQuery {
 in_vars: BTreeSet::default(),
 in_sources: BTreeSet::default(),
 limit: Limit::None,
-where_clauses: where_clauses,
+where_clauses,
 order: None,
 }
 }
@@ -417,5 +417,5 @@ impl FindQuery {
 pub fn parse_find_string(string: &str) -> Result<FindQuery> {
 parse_query(string)
 .map_err(|e| e.into())
-.and_then(|parsed| FindQuery::from_parsed_query(parsed))
+.and_then(FindQuery::from_parsed_query)
 }
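The last hunk above replaces `.and_then(|parsed| FindQuery::from_parsed_query(parsed))` with `.and_then(FindQuery::from_parsed_query)`: a closure that only forwards its argument to a function is `clippy::redundant_closure`, and the function path can be passed directly. A minimal sketch:

```rust
// A closure that merely forwards its argument is redundant; passing the
// function path itself is equivalent (clippy::redundant_closure).
fn double(x: i32) -> Option<i32> {
    Some(x * 2)
}

fn main() {
    let with_closure = Some(21).and_then(|x| double(x));
    let without_closure = Some(21).and_then(double);
    assert_eq!(with_closure, without_closure);
    assert_eq!(without_closure, Some(42));
}
```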

View file

@@ -153,8 +153,8 @@ impl ColumnName for DatomsColumn {
 impl ColumnName for VariableColumn {
 fn column_name(&self) -> String {
 match self {
-&VariableColumn::Variable(ref v) => v.to_string(),
+VariableColumn::Variable(ref v) => v.to_string(),
-&VariableColumn::VariableTypeTag(ref v) => format!("{}_value_type_tag", v.as_str()),
+VariableColumn::VariableTypeTag(ref v) => format!("{}_value_type_tag", v.as_str()),
 }
 }
 }
@@ -163,8 +163,8 @@ impl Debug for VariableColumn {
 fn fmt(&self, f: &mut Formatter) -> ::std::fmt::Result {
 match self {
 // These should agree with VariableColumn::column_name.
-&VariableColumn::Variable(ref v) => write!(f, "{}", v.as_str()),
+VariableColumn::Variable(ref v) => write!(f, "{}", v.as_str()),
-&VariableColumn::VariableTypeTag(ref v) => write!(f, "{}_value_type_tag", v.as_str()),
+VariableColumn::VariableTypeTag(ref v) => write!(f, "{}_value_type_tag", v.as_str()),
 }
 }
 }
@@ -178,10 +178,10 @@ impl Debug for DatomsColumn {
 impl Debug for Column {
 fn fmt(&self, f: &mut Formatter) -> ::std::fmt::Result {
 match self {
-&Column::Fixed(ref c) => c.fmt(f),
+Column::Fixed(ref c) => c.fmt(f),
-&Column::Fulltext(ref c) => c.fmt(f),
+Column::Fulltext(ref c) => c.fmt(f),
-&Column::Variable(ref v) => v.fmt(f),
+Column::Variable(ref v) => v.fmt(f),
-&Column::Transactions(ref t) => t.fmt(f),
+Column::Transactions(ref t) => t.fmt(f),
 }
 }
 }
@@ -298,10 +298,10 @@ impl Debug for QueryValue {
 fn fmt(&self, f: &mut Formatter) -> ::std::fmt::Result {
 use self::QueryValue::*;
 match self {
-&Column(ref qa) => write!(f, "{:?}", qa),
+Column(ref qa) => write!(f, "{:?}", qa),
-&Entid(ref entid) => write!(f, "entity({:?})", entid),
+Entid(ref entid) => write!(f, "entity({:?})", entid),
-&TypedValue(ref typed_value) => write!(f, "value({:?})", typed_value),
+TypedValue(ref typed_value) => write!(f, "value({:?})", typed_value),
-&PrimitiveLong(value) => write!(f, "primitive({:?})", value),
+PrimitiveLong(value) => write!(f, "primitive({:?})", value),
 }
 }
 }
@@ -375,15 +375,15 @@ impl Inequality {
 }
 // The built-in inequality operators apply to Long, Double, and Instant.
-pub fn supported_types(&self) -> ValueTypeSet {
+pub fn supported_types(self) -> ValueTypeSet {
 use self::Inequality::*;
 match self {
-&LessThan | &LessThanOrEquals | &GreaterThan | &GreaterThanOrEquals | &NotEquals => {
+LessThan | LessThanOrEquals | GreaterThan | GreaterThanOrEquals | NotEquals => {
 let mut ts = ValueTypeSet::of_numeric_types();
 ts.insert(ValueType::Instant);
 ts
 }
-&Unpermute | &Differ | &TxAfter | &TxBefore => ValueTypeSet::of_one(ValueType::Ref),
+Unpermute | Differ | TxAfter | TxBefore => ValueTypeSet::of_one(ValueType::Ref),
 }
 }
 }
@@ -392,17 +392,17 @@ impl Debug for Inequality {
 fn fmt(&self, f: &mut Formatter) -> ::std::fmt::Result {
 use self::Inequality::*;
 f.write_str(match self {
-&LessThan => "<",
+LessThan => "<",
-&LessThanOrEquals => "<=",
+LessThanOrEquals => "<=",
-&GreaterThan => ">",
+GreaterThan => ">",
-&GreaterThanOrEquals => ">=",
+GreaterThanOrEquals => ">=",
-&NotEquals => "!=", // Datalog uses !=. SQL uses <>.
+NotEquals => "!=", // Datalog uses !=. SQL uses <>.
-&Unpermute => "<",
+Unpermute => "<",
-&Differ => "<>",
+Differ => "<>",
-&TxAfter => ">",
+TxAfter => ">",
-&TxBefore => "<",
+TxBefore => "<",
 })
 }
 }
@@ -534,17 +534,17 @@ impl Debug for ColumnConstraint {
 fn fmt(&self, f: &mut Formatter) -> ::std::fmt::Result {
 use self::ColumnConstraint::*;
 match self {
-&Equals(ref qa1, ref thing) => write!(f, "{:?} = {:?}", qa1, thing),
+Equals(ref qa1, ref thing) => write!(f, "{:?} = {:?}", qa1, thing),
-&Inequality {
+Inequality {
 operator,
 ref left,
 ref right,
 } => write!(f, "{:?} {:?} {:?}", left, operator, right),
-&Matches(ref qa, ref thing) => write!(f, "{:?} MATCHES {:?}", qa, thing),
+Matches(ref qa, ref thing) => write!(f, "{:?} MATCHES {:?}", qa, thing),
-&HasTypes {
+HasTypes {
 ref value,
 ref value_types,
 check_value,
@@ -553,7 +553,7 @@ impl Debug for ColumnConstraint {
 write!(f, "(")?;
 for value_type in value_types.iter() {
 write!(f, "({:?}.value_type_tag = {:?}", value, value_type)?;
-if check_value && value_type == ValueType::Double
+if *check_value && value_type == ValueType::Double
 || value_type == ValueType::Long
 {
 write!(
@@ -573,7 +573,7 @@ impl Debug for ColumnConstraint {
 }
 write!(f, "1)")
 }
-&NotExists(ref ct) => write!(f, "NOT EXISTS {:?}", ct),
+NotExists(ref ct) => write!(f, "NOT EXISTS {:?}", ct),
 }
 }
 }
@@ -625,15 +625,15 @@ impl Debug for EmptyBecause {
 fn fmt(&self, f: &mut Formatter) -> ::std::fmt::Result {
 use self::EmptyBecause::*;
 match self {
-&CachedAttributeHasNoEntity {
+CachedAttributeHasNoEntity {
 ref value,
 ref attr,
 } => write!(f, "(?e, {}, {:?}, _) not present in store", attr, value),
-&CachedAttributeHasNoValues {
+CachedAttributeHasNoValues {
 ref entity,
 ref attr,
 } => write!(f, "({}, {}, ?v, _) not present in store", entity, attr),
-&ConflictingBindings {
+ConflictingBindings {
 ref var,
 ref existing,
 ref desired,
@@ -642,7 +642,7 @@ impl Debug for EmptyBecause {
 "Var {:?} can't be {:?} because it's already bound to {:?}",
 var, desired, existing
 ),
-&TypeMismatch {
+TypeMismatch {
 ref var,
 ref existing,
 ref desired,
@@ -651,7 +651,7 @@ impl Debug for EmptyBecause {
 "Type mismatch: {:?} can't be {:?}, because it's already {:?}",
 var, desired, existing
 ),
-&KnownTypeMismatch {
+KnownTypeMismatch {
 ref left,
 ref right,
 } => write!(
@@ -659,25 +659,25 @@ impl Debug for EmptyBecause {
 "Type mismatch: {:?} can't be compared to {:?}",
 left, right
 ),
-&NoValidTypes(ref var) => write!(f, "Type mismatch: {:?} has no valid types", var),
+NoValidTypes(ref var) => write!(f, "Type mismatch: {:?} has no valid types", var),
-&NonAttributeArgument => write!(f, "Non-attribute argument in attribute place"),
+NonAttributeArgument => write!(f, "Non-attribute argument in attribute place"),
-&NonInstantArgument => write!(f, "Non-instant argument in instant place"),
+NonInstantArgument => write!(f, "Non-instant argument in instant place"),
-&NonEntityArgument => write!(f, "Non-entity argument in entity place"),
+NonEntityArgument => write!(f, "Non-entity argument in entity place"),
-&NonNumericArgument => write!(f, "Non-numeric argument in numeric place"),
+NonNumericArgument => write!(f, "Non-numeric argument in numeric place"),
-&NonStringFulltextValue => write!(f, "Non-string argument for fulltext attribute"),
+NonStringFulltextValue => write!(f, "Non-string argument for fulltext attribute"),
-&UnresolvedIdent(ref kw) => write!(f, "Couldn't resolve keyword {}", kw),
+UnresolvedIdent(ref kw) => write!(f, "Couldn't resolve keyword {}", kw),
-&InvalidAttributeIdent(ref kw) => write!(f, "{} does not name an attribute", kw),
+InvalidAttributeIdent(ref kw) => write!(f, "{} does not name an attribute", kw),
-&InvalidAttributeEntid(entid) => write!(f, "{} is not an attribute", entid),
+InvalidAttributeEntid(entid) => write!(f, "{} is not an attribute", entid),
-&NonFulltextAttribute(entid) => write!(f, "{} is not a fulltext attribute", entid),
+NonFulltextAttribute(entid) => write!(f, "{} is not a fulltext attribute", entid),
-&InvalidBinding(ref column, ref tv) => {
+InvalidBinding(ref column, ref tv) => {
 write!(f, "{:?} cannot name column {:?}", tv, column)
 }
-&ValueTypeMismatch(value_type, ref typed_value) => write!(
+ValueTypeMismatch(value_type, ref typed_value) => write!(
 f,
 "Type mismatch: {:?} doesn't match attribute type {:?}",
 typed_value, value_type
 ),
-&AttributeLookupFailed => write!(f, "Attribute lookup failed"),
+AttributeLookupFailed => write!(f, "Attribute lookup failed"),
 }
 }
 }

View file

@@ -158,8 +158,7 @@ pub struct SubvecIntoIterator<T> {
 }
 impl<T> Iterator for SubvecIntoIterator<T> {
-// TODO: this is a good opportunity to use `SmallVec` instead: most queries
-// return a handful of columns.
+// TODO: this is a good opportunity to use `SmallVec` instead: most queries return a handful of columns.
 type Item = Vec<T>;
 fn next(&mut self) -> Option<Self::Item> {
 let result: Vec<_> = (&mut self.values).take(self.width).collect();

View file

@@ -73,6 +73,7 @@ impl QueryFragment for () {
 }
 /// A QueryBuilder that implements SQLite's specific escaping rules.
+#[derive(Default)]
 pub struct SQLiteQueryBuilder {
 pub sql: String,
@@ -107,7 +108,7 @@ impl SQLiteQueryBuilder {
 fn next_argument_name(&mut self) -> String {
 let arg = format!("{}{}", self.arg_prefix, self.arg_counter);
-self.arg_counter = self.arg_counter + 1;
+self.arg_counter += 1;
 arg
 }
@@ -138,10 +139,10 @@ impl QueryBuilder for SQLiteQueryBuilder {
 fn push_typed_value(&mut self, value: &TypedValue) -> BuildQueryResult {
 use TypedValue::*;
 match value {
-&Ref(entid) => self.push_sql(entid.to_string().as_str()),
+Ref(entid) => self.push_sql(entid.to_string().as_str()),
-&Boolean(v) => self.push_sql(if v { "1" } else { "0" }),
+Boolean(v) => self.push_sql(if *v { "1" } else { "0" }),
-&Long(v) => self.push_sql(v.to_string().as_str()),
+Long(v) => self.push_sql(v.to_string().as_str()),
-&Double(OrderedFloat(v)) => {
+Double(OrderedFloat(v)) => {
 // Rust's floats print without a trailing '.' in some cases.
 // https://github.com/rust-lang/rust/issues/30967
 // We format with 'e' -- scientific notation -- so that SQLite treats them as
@@ -149,10 +150,10 @@ impl QueryBuilder for SQLiteQueryBuilder {
 // will currently (2017-06) always be 0, and need to round-trip as doubles.
 self.push_sql(format!("{:e}", v).as_str());
 }
-&Instant(dt) => {
+Instant(dt) => {
 self.push_sql(format!("{}", dt.to_micros()).as_str()); // TODO: argument instead?
 }
-&Uuid(ref u) => {
+Uuid(ref u) => {
 let bytes = u.as_bytes();
 if let Some(arg) = self.byte_args.get(bytes.as_ref()).cloned() {
 // Why, borrow checker, why?!
@@ -166,7 +167,7 @@ impl QueryBuilder for SQLiteQueryBuilder {
 // These are both `Rc`. Unfortunately, we can't use that fact when
 // turning these into rusqlite Values.
 // However, we can check to see whether there's an existing var that matches…
-&String(ref s) => {
+String(ref s) => {
 if let Some(arg) = self.string_args.get(s).cloned() {
 self.push_named_arg(arg.as_str());
 } else {
@@ -175,7 +176,7 @@ impl QueryBuilder for SQLiteQueryBuilder {
 self.string_args.insert(s.clone(), arg);
 }
 }
-&Keyword(ref s) => {
+Keyword(ref s) => {
 // TODO: intern.
 let v = Rc::new(rusqlite::types::Value::Text(s.as_ref().to_string()));
 self.push_static_arg(v);
@@ -233,7 +234,7 @@ impl QueryBuilder for SQLiteQueryBuilder {
 args.sort_by(|&(ref k1, _), &(ref k2, _)| k1.cmp(k2));
 SQLQuery {
 sql: self.sql,
-args: args,
+args,
 }
 }
 }
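Two more lints appear in this file: `self.arg_counter = self.arg_counter + 1` becomes `+= 1` (clippy::assign_op_pattern), and `#[derive(Default)]` is added to the builder struct, which works whenever every field implements `Default`. A sketch with a hypothetical counter struct modeled on `next_argument_name`:

```rust
// `#[derive(Default)]` is valid because String and u32 both implement Default.
#[derive(Default)]
struct ArgCounter {
    prefix: String,
    counter: u32,
}

impl ArgCounter {
    fn next_argument_name(&mut self) -> String {
        let arg = format!("{}{}", self.prefix, self.counter);
        // clippy::assign_op_pattern prefers `+=` over `c = c + 1`.
        self.counter += 1;
        arg
    }
}

fn main() {
    // Struct-update syntax lets us set one field and default the rest.
    let mut c = ArgCounter { prefix: "$v".to_string(), ..Default::default() };
    assert_eq!(c.next_argument_name(), "$v0");
    assert_eq!(c.next_argument_name(), "$v1");
}
```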

View file

@@ -240,7 +240,7 @@ impl Conn {
 let tx = sqlite.transaction_with_behavior(behavior)?;
 let (current_generation, current_partition_map, current_schema, cache_cow) = {
 // The mutex is taken during this block.
-let ref current: Metadata = *self.metadata.lock().unwrap();
+let current: &Metadata = &(*self.metadata.lock().unwrap());
 (
 current.generation,
 // Expensive, but the partition map is updated after every committed transaction.
@@ -367,7 +367,7 @@ impl Conn {
 .register(key, observer);
 }
-pub fn unregister_observer(&mut self, key: &String) {
+pub fn unregister_observer(&mut self, key: &str) {
 self.tx_observer_service.lock().unwrap().deregister(key);
 }
 }
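The `unregister_observer` signature change from `key: &String` to `key: &str` is clippy's `ptr_arg`: a `&String` parameter forces every caller to hold an owned `String`, while `&str` accepts owned strings, slices, and literals alike via deref coercion, with no cost to the callee. A sketch with a hypothetical registry:

```rust
// `&str` accepts every kind of string argument; `&String` would reject literals.
fn unregister_observer(registry: &mut Vec<String>, key: &str) {
    registry.retain(|k| k != key);
}

fn main() {
    let mut registry = vec!["a".to_string(), "b".to_string()];
    let owned = "a".to_string();
    unregister_observer(&mut registry, &owned); // &String coerces to &str
    unregister_observer(&mut registry, "missing"); // literals work too
    assert_eq!(registry, vec!["b".to_string()]);
}
```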

View file

@@ -52,7 +52,7 @@ impl Store {
 let mut connection = crate::new_connection(path)?;
 let conn = Conn::connect(&mut connection)?;
 Ok(Store {
-conn: conn,
+conn,
 sqlite: connection,
 })
 }
@@ -100,7 +100,7 @@ impl Store {
 let mut connection = crate::new_connection_with_key(path, encryption_key)?;
 let conn = Conn::connect(&mut connection)?;
 Ok(Store {
-conn: conn,
+conn,
 sqlite: connection,
 })
 }

View file

@@ -234,7 +234,7 @@ impl Definition {
 {
 Definition {
 name: name.into(),
-version: version,
+version,
 attributes: attributes.into(),
 pre: Definition::no_op,
 post: Definition::no_op,
@@ -730,8 +730,8 @@ impl<'a, 'c> VersionedStore for InProgress<'a, 'c> {
 }
 c @ VocabularyCheck::NotPresent
-| c @ VocabularyCheck::PresentButNeedsUpdate { older_version: _ }
+| c @ VocabularyCheck::PresentButNeedsUpdate { .. }
-| c @ VocabularyCheck::PresentButMissingAttributes { attributes: _ } => {
+| c @ VocabularyCheck::PresentButMissingAttributes { .. } => {
 work.add(definition, c);
 }
 }
@@ -758,8 +758,7 @@ impl<'a, 'c> VersionedStore for InProgress<'a, 'c> {
 // Save this: we'll do it later.
 missing.push((definition, attributes));
 }
-VocabularyCheck::Present
-| VocabularyCheck::PresentButTooNew { newer_version: _ } => {
+VocabularyCheck::Present | VocabularyCheck::PresentButTooNew { .. } => {
 unreachable!();
 }
 }
@@ -815,9 +814,9 @@ impl SimpleVocabularySource {
 post: Option<fn(&mut InProgress<'_, '_>) -> Result<()>>,
 ) -> SimpleVocabularySource {
 SimpleVocabularySource {
-pre: pre,
+pre,
-post: post,
+post,
-definitions: definitions,
+definitions,
 }
 }
@@ -910,8 +909,8 @@ where
 .collect();
 Ok(Some(Vocabulary {
 entity: entid.into(),
-version: version,
+version,
-attributes: attributes,
+attributes,
 }))
 }
 Some(_) => bail!(MentatError::InvalidVocabularyVersion),
@@ -982,7 +981,7 @@ where
 name.clone(),
 Vocabulary {
 entity: vocab,
-version: version,
+version,
 attributes: attrs,
 },
 )