This removes the convert-from-arithmetic-scalar constructor of
any_container as it can result in ambiguous calls, as in:

    py::array_t<float>({ 1, 2 })

which could be interpreted as either of:

    py::array_t<float>(py::array_t<float>(1, 2))
    py::array_t<float>(py::detail::any_container({ 1, 2 }))
Removing the convert-from-arithmetic constructor reduces the number of
implicit conversions, avoiding the ambiguity for array and array_t.
This also re-adds the array/array_t constructors taking a scalar
argument for backwards compatibility.
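A quick illustration of the resulting, now-unambiguous behaviour (a
sketch, assuming pybind11's numpy header and an initialized interpreter):

    #include <pybind11/numpy.h>
    namespace py = pybind11;

    // inside some function:
    py::array_t<float> a(10);       // scalar size: a 1-D array of 10 floats
    py::array_t<float> b({ 1, 2 }); // braced list: a 2-D array of shape {1, 2}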
The job is now using the released clang and stable-branch libc++, which
wasn't the case when it was added. Leave the g++7/c++17 job in
allow_failures for now, as it's still a pre-release compiler (and is
pulled from debian experimental).
Python 3's `PyInstanceMethod_Type` hides itself via its `tp_descr_get`,
which prevents aliasing methods via `cls.attr("m2") = cls.attr("m1")`:
instead, the `tp_descr_get` returns a plain function when called on a
class, or a `PyMethod` when called on an instance. Override that
behaviour for pybind11 types with a special bypass for
`PyInstanceMethod_Type`.
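For illustration, the aliasing pattern this enables (a sketch; the class,
method, and module names are made up here):

    #include <pybind11/pybind11.h>
    namespace py = pybind11;

    struct Example { int m1() const { return 1; } };

    PYBIND11_MODULE(demo, m) {
        py::class_<Example> cls(m, "Example");
        cls.def(py::init<>());
        cls.def("m1", &Example::m1);
        cls.attr("m2") = cls.attr("m1");  // Example().m2() now behaves like m1()
    }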
The Unicode support added in 2.1 (PR #624) inadvertently broke accepting
`bytes` as std::string/char* arguments. This restores it with a
separate path that does a plain conversion (i.e. completely bypassing
all the encoding/decoding code), but only for single-byte string types.
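A sketch of the restored behaviour (the binding name is illustrative):

    // inside a module definition:
    m.def("byte_len", [](const std::string &s) { return s.size(); });
    // Python: byte_len(b"\xff\x00ab") -> 4; the bytes are passed through
    // verbatim, with no decoding step involved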
The numpy API constants can index past the end of the API array if the
numpy version is too old, causing a segfault. The current list of
functions requires numpy >= 1.7.0, so this adds a check that throws an
exception if numpy is too old.
The feature-version API element used for this check was itself only
added in numpy 1.4.0, so this could still segfault if loaded under 1.3.0
or earlier, but given that 1.4.0 was released at the end of 2009, it
seems reasonable enough not to worry about that case. (1.7.0 was
released in early 2013.)
This commit adds base class pointers of offset base classes (i.e. due
to multiple inheritance) to `registered_instances` so that if such a
pointer is returned we properly recognize it as an existing instance.
Without this, returning a base class pointer will cast to the existing
instance if the pointer happens to coincide with the instance pointer,
but constructs a new instance (quite possibly with a segfault, if
ownership is applied) for unequal base class pointers due to multiple
inheritance.
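The underlying pointer arithmetic, as a plain C++ sketch (no pybind11
involved):

    struct Base1 { virtual ~Base1() = default; int a = 1; };
    struct Base2 { virtual ~Base2() = default; int b = 2; };
    struct Derived : Base1, Base2 {};

    Derived d;
    Base1 *b1 = &d;  // typically the same address as &d
    Base2 *b2 = &d;  // offset past the Base1 subobject: b2 != (void *) &d
    // reinterpret_cast<Derived *>(b2) yields a garbage pointer; only a
    // static_cast/dynamic_cast applies the required offset.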
When we are returned a base class pointer (either directly or via
shared_from_this()) we detect its runtime type (using `typeid`), then
end up essentially reinterpret_cast'ing the pointer to the derived type.
This cast is invalid when the base class pointer points at a non-first
base, leaving us with a garbage pointer. We could dynamic_cast to the
most-derived type, but if *that* type isn't pybind11-registered, the
resulting pointer given to the base `cast` implementation isn't necessarily valid
to be reinterpret_cast'ed back to the backup type.
This commit removes the "backup" type argument from the many-argument
`cast(...)` and instead does the derived-or-pointer type decision and
type lookup in type_caster_base, where the dynamic_cast has to be to
correctly get the derived pointer, but also has to do the type lookup to
ensure that we don't pass the wrong (derived) pointer when the backup
type (i.e. the type caster intrinsic type) pointer is needed.
Since the lookup is needed before calling the base cast(), this also
changes the input type to a detail::type_info rather than doing a
(second) lookup in cast().
This breaks up the instance management functions in class_support.h a
little bit so that other pybind11 code can use them. In particular:
- added make_new_instance() which does what pybind11_object_new does,
but also allows instance allocation without `value` allocation. This
lets `cast.h` use the same instance allocation rather than having its
own separate implementation.
- instance registration is now handled by a
`register_instance()`/`deregister_instance()` pair (rather than having
individual code add or remove things from `registered_instances`
directly).
- clear_instance() does everything `pybind11_object_dealloc()` needs
except for the deallocation; this is helpful for factory construction
which needs to be able to replace the internals of an instance without
deallocating it.
- clear_instance() now also calls `dealloc` when `holder_constructed`
is true, even if `value` is false. This can happen in factory
construction when the pointer is moved from one instance to another,
but the holder itself is only copied (i.e. for a shared_ptr holder).
I got some unexpected errors from code using `overload_cast` until I
realized that I'd configured the build with -std=c++11.
This commit adds a fake `overload_cast` class in C++11 mode that
triggers a static_assert failure indicating that C++14 is needed.
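The usual trick for this, sketched below with made-up helper names (not
necessarily the actual pybind11 internals), is a dependent-false trait so
the assert fires only when the template is actually instantiated:

    #include <type_traits>

    template <typename... Ts> struct dependent_false : std::false_type {};

    template <typename... Args> struct overload_cast {
        static_assert(dependent_false<Args...>::value,
                      "pybind11::overload_cast<...> requires compiling in C++14 mode");
    };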
We currently fail at runtime when trying to call a method that is
overloaded with both static and non-static methods. This is something
Python won't allow: the attribute is either a plain function or an
instance method, and can't be both.
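The problematic pattern looks something like this (names illustrative):

    struct Example { int val() const { return 42; } };

    py::class_<Example>(m, "Example")
        .def("anything", &Example::val)
        .def_static("anything", [](int i) { return i; });
        // ^ with this change, rejected up front when binding rather than
        //   failing when called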
Adding numpy to the pypy test exposed a segfault caused by the buffer
tests in test_stl_binders.py: the first such test was explicitly skipped
on pypy, but the second (test_vector_buffer_numpy), which also seems to
cause an occasional segfault, was just marked as requiring numpy.
Explicitly skip it on pypy as well (until a workaround, fix, or pypy fix
is found).
Various bash variables that are only used in the travis-ci script and
don't need to propagate (e.g. to cmake) are being pointlessly exported;
this removes these `export`s.
This uses the trusty container rather than docker for the clang 4.0
build. It also caches the local libc++ installation so that it doesn't
need to be compiled every time, which should speed up the job
considerably.
This applies several changes to the non-docker travis-ci builds:
- Make all builds use trusty rather than precise. pybind can't really
build in precise anyway (we install essentially the entire toolchain
backported from trusty on every build), and so this saves needing to
install all the backported packages during the build setup.
- Updated the 3.5 build to 3.6 (via deadsnakes, which didn't backport
3.6 to ubuntu releases earlier than trusty).
- As a result of the switch to trusty, the BAREBONES build now picks up
the (default installed) python 3.5 installation.
- Invoke pip everywhere via $PYTHON -m pip rather than the pip
executable, which saves us having to figure out what the pip
executable is, and ensures that we are using the correct pip.
- Install packages with `pip --user` rather than in a virtualenv.
- Add the local user python package archive to the travis-ci cache
(rather than the pip cache). This saves having to install packages on
every build (unless there are updates, in which case the package and
the cache are updated).
- Install numpy and scipy on the pypy build. This has to build from
source (and so blas and fortran need to be installed on the build),
but given the above caching, the build will only be slow for the first
build after a new numpy/scipy release. This testing is valuable:
numpy has various behaviour differences under pypy.
- Added set -e/+e around the before_install/install blocks so that a
failure here (e.g. a pip install failure or dependency download
failure) triggers a build failure.
- Update eigen version to latest (3.3.3), mainly to be consistent with
the appveyor build.
- The travis trusty environment has an upgraded cmake, so this
downgrades cmake (to the stock trusty version) on the first couple
jobs so that we're still including some cmake 2.8.12 testing.
Don't try to define these in the issues submodule, because that fails
if testing without issues compiled in (e.g. using
cmake -DPYBIND11_TEST_OVERRIDE=test_methods_and_attributes.cpp).
This further reduces the constructors required in buffer_info/numpy by
removing the need for the constructors that take a single size_t and
just forward it on via an initializer_list to the container-accepting
constructor.
Unfortunately, in `array` one of the constructors runs into an ambiguity
problem with the deprecated `array(handle, bool)` constructor (because
both the bool constructor and the any_container constructor involve an
implicit conversion, so neither has precedence), so a forwarding
constructor is kept there (until the deprecated constructor is
eventually removed).
This adds support for constructing `buffer_info` and `array`s using
arbitrary containers or iterator pairs instead of requiring a vector.
This is primarily needed by PR #782 (which makes strides signed to
properly support negative strides, and will likely also make shape and
itemsize signed to avoid mixed integer issues), but also needs to preserve
backwards compatibility with 2.1 and earlier which accepts the strides
parameter as a vector of size_t's.
Rather than adding nearly duplicate constructors for each stride-taking
constructor, it seems nicer to simply allow any type of container (or
iterator pairs). This works by replacing the existing vector arguments
with a new `detail::any_container` class that handles implicit
conversion of arbitrary containers into a vector of the desired type.
It can also be explicitly instantiated with a pair of iterators (e.g.
by passing {begin, end} instead of the container).
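For example, all of the following forms are accepted (a sketch):

    std::vector<size_t> shape{2, 3};

    py::array_t<double> a(shape);                         // arbitrary container
    py::array_t<double> b({shape.begin(), shape.end()});  // iterator pair
    py::array_t<double> c({2, 3});                        // initializer list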
Upcoming changes to buffer_info make it need some things declared in
common.h; it also feels a bit misplaced in common.h (which is arguably
too large already), so move it out. (Separating this and the subsequent
changes into separate commits to make the changes easier to distinguish
from the move.)
When attempting to get a raw array pointer we return nullptr if given a
nullptr, which triggers an error_already_set(), but since no exception
message has been set, this results in "Unknown internal error".
Callers that want explicit allowing of a nullptr here already handle it
(by clearing the exception after the call).
When processing many files that contain top-level items with the same
name (e.g. "operator<<"), the output was non-deterministic and depended
on the order in which the different Clang processes finished. This
commit adds sorting that also accounts for the filename to prevent
random changes from run to run.
Many of the Eigen type casters' name() methods weren't wrapping the type
description in a `type_descr` object, and thus weren't adding the
"{...}" annotation used to identify an argument; this broke the help
output by skipping Eigen arguments.
The test code I had added even had some (unnoticed) broken output (with
the "arg0: " showing up in the return value).
This commit also adds test code to ensure that named eigen arguments
actually work properly, despite the invalid help output. (The added
tests pass without the rest of this commit).
The holder casters assume but don't check that a `holder<type>`'s `type`
is really a `type_caster_base<type>`; this adds a static_assert to make
sure this is really the case, to turn things like
`std::shared_ptr<array>` into a compilation failure.
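So something like the following now fails with a clear compile-time error
instead of producing broken casts (the exact message wording may differ):

    // py::array is not a custom type, so this holder is rejected:
    m.def("bad", []() { return std::make_shared<py::array>(); });
    // error: static assertion failed:
    //   "Holder classes are only supported for custom types"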
Fixes #785
Fixes #775.
Assignments of the form `Type.static_prop = value` should be translated to
`Type.static_prop.__set__(value)` except when `isinstance(value, static_prop)`.
PR #771 deprecated them as they can cause linking failures (#770), but
the deprecation tags cause warnings on GCC 5.x through 6.2.x. Removing
them entirely has backwards-compatibility consequences, but the effects
should be minimal (only code that was inheriting from `object` could get
at them at all, as they are protected).
Fixes #777
When make_tuple fails (for example, when print() is called with a
non-convertible argument, as in #778) the error message is less helpful
than it could be:

    make_tuple(): unable to convert arguments of types 'std::tuple<type1, type2>' to Python object
There is no actual std::tuple involved (only a parameter pack and a
Python tuple), but it also doesn't immediately reveal which type caused
the problem.
This commit changes the debugging mode output to show just the
problematic type:

    make_tuple(): unable to convert argument of type 'type2' to Python object
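A sketch of code that triggers the message (the struct is hypothetical
and deliberately not registered with pybind11):

    struct NoCaster {};

    try {
        py::make_tuple(1, NoCaster());
    } catch (const py::cast_error &) {
        // debug builds now report:
        //   make_tuple(): unable to convert argument of type 'NoCaster'
        //   to Python object
    }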
This commit adds `error_already_set::matches()` convenience method to
check if the exception trapped by `error_already_set` matches a given
Python exception type. This will address #700 by providing a less
verbose way to check exceptions.
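For example (the module name here is illustrative):

    try {
        py::module::import("nonexistent_module");
    } catch (py::error_already_set &e) {
        if (e.matches(PyExc_ImportError)) {
            // expected: handle the missing module here
        } else {
            throw;  // a different Python exception: re-raise it
        }
    }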
The constexpr static instances can cause linking failures if the
compiler doesn't optimize away the reference, as reported in #770.
There's no particularly nice way of fixing this in C++11/14: inline
definitions to match the declaration aren't permitted for non-templated
static variables (C++17 *does* allow "inline" on variables, but that
obviously doesn't help us).
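The underlying language issue, in miniature:

    struct S { static constexpr int v = 42; };  // declaration + initializer

    const int &r = S::v;  // ODR-use: in C++11/14 this requires an
                          // out-of-line "constexpr int S::v;" definition in
                          // some translation unit, or the link may fail with
                          // an undefined reference to S::v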
One solution that could work around it is to add an extra inherited
subclass to `object`'s hierarchy, but that's a bit of a messy solution
and was decided against in #771 in favour of just deprecating (and
eventually dropping) the constexpr statics.
Fixes #770.
* Arch-indep CMake packaging
Since pybind11 is a header-only library, the CMake packaging does not have to carry any architecture-specific checks. Without this patch, the detection of pybind11 will fail on 32-bit architectures if the project was built on a 64-bit machine, and vice versa. This fix is similar to what is applied to `Eigen` and other header-only C++ libraries.