
I have VB6 code, which cannot be changed, calling functions from an external dll. I am replacing the external dll with a C++ dll which I'm writing, which will call a .NET dll which I'm writing, via COM. I have control over the C++ and C#.

VB6 calls the dll functions like so:

Declare Sub IEsend _
    Lib "IEEE_32M1.DLL" _
    Alias "_ieee_send@16" (ByVal addr As Long, _
                           ByVal s As String, _
                           ByVal l As Long, _
                           status As Long)

C# exposing functions via COM for C++ to call

namespace IEEE_32MCOM
{
    [ComVisible(true)]
    [InterfaceType(ComInterfaceType.InterfaceIsDual)]
    [Guid("41C9877D-C110-4CBC-9E1B-4507D220DBCD")]
    public interface IIEEE_32MCOM
    {
        void _ieee_sendCOM(int addr, string s, int l, ref int status);
    }

    [ComVisible(true)]
    [ClassInterface(ClassInterfaceType.AutoDual)]
    [Guid("FE506B77-AD6B-4C37-97E2-A7A5DDE0CA52")]
    public class IEEE_32MCOMClass : IIEEE_32MCOM
    {
        public void _ieee_sendCOM(int addr, string s, int l, ref int status)
        {
            // .NET stuff here
        }
    }
}

C++ header

#pragma once
#include <string>
#import "IEEE_32MCOM.tlb" raw_interfaces_only, named_guids

extern "C"
{
#pragma comment(linker, "/EXPORT:ieee_send=_ieee_send@16")
    __declspec(dllexport) void __stdcall ieee_send(long addr, BSTR cmd, long l, long* status);
}

IEEE_32MCOM::IIEEE_32MCOM* myInterface;

void init();

C++ code file

#pragma comment(lib, "IEEE_32MCOM.tlb")
#include "pch.h"
#include "IEEE_32M1.h"

extern "C" __declspec(dllimport) void _ieee_sendCOM(long addr, BSTR cmd, long l, long* status);

void __stdcall ieee_send(long addr, BSTR cmd, long l, long* status)
{
    init();
    myInterface->_ieee_sendCOM(addr, cmd, l, status);
}

void init()
{
    HRESULT hr = CoCreateInstance(
        __uuidof(IEEE_32MCOM::IEEE_32MCOMClass),
        NULL,
        CLSCTX_ALL,
        IEEE_32MCOM::IID_IIEEE_32MCOM,
        (void**)&myInterface);
    if (FAILED(hr))
    {
        _com_error e(hr);
    }
}

The call chain is working after much research - I am not fluent in C++.

Except the issue is that strings being passed from VB6 seem to be 8-bit characters, while the BSTR in C++ is interpreting them as 16-bit characters.

Passing the 9 character string "ABCDEFGHI",

C++ sees 䉁䑃䙅䡇I

C# sees 䉁䑃䙅䡇

I don't mind converting to a human-readable string in C#; actually, I prefer it. But as can be seen, the last character is missing from the string in C#.

Inspecting the bytes, var bytes = Encoding.Unicode.GetBytes(s); yields 8 bytes containing the first 8 characters (65, 66, ..., 72), missing the final character 73, or "I".

I believe I am restricted to BSTR for COM, but it's cutting off a character when the number of characters is odd. How do I pass the entire string to .NET? I would prefer the entire human readable string get passed from C++ to C# in the first place but any way I can make this work is fine.


asked Nov 21, 2024 at 19:21 by djv
  • Show how you are creating and passing around the BSTR; it likely starts there. "I am replacing the external dll with a C++ dll which I'm writing, which will call a .NET dll which I'm writing" — why not just create a C# dll and call it directly from VB6? – Charlieface Commented Nov 21, 2024 at 19:43
  • @Charlieface because of the calling convention in VB6 using the stdcall decorations. I can't modify the VB6 code; otherwise, if I could, I would use COM there – djv Commented Nov 21, 2024 at 19:50
  • Do you have a small reproducing project? – Simon Mourier Commented Nov 21, 2024 at 19:50
  • C# can use StdCall, why would you think otherwise? COM works fine from VB6 to C#. And why does this need COM, why not just use DllExport github.com/3F/DllExport – Charlieface Commented Nov 21, 2024 at 19:51
  • @Charlieface still learning. I haven't needed to do something like this before. I know COM works from VB6 to C#, but I need to import a function as the VB6 is doing, and all my research pointed to no, a .NET assembly won't have a suitable entrypoint to be called that way. Maybe the github project you linked handles everything – djv Commented Nov 21, 2024 at 19:58

1 Answer


Except the issue is that strings being passed from VB6 seem to be 8-bit characters, while the BSTR in C++ is interpreting them as 16-bit characters.

That is exactly what is happening. So, you need to change your DLL to not accept the string as a BSTR, but rather as a char*, and then convert it to BSTR/wchar_t* when passing it to C#.

For example:

#include <comdef.h>   // for _bstr_t, which converts the ANSI string to a Unicode BSTR

void __stdcall ieee_send(long addr, const char* cmd, long l, long* status)
{
    init();
    myInterface->_ieee_sendCOM(addr, _bstr_t(cmd), l, status);
}
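Note that `_bstr_t(cmd)` assumes the string is NUL-terminated. Since the VB6 declaration also passes the length `l`, you can instead convert exactly `l` ANSI bytes, which also works for buffers without a terminator. A minimal sketch (the helper name `AnsiToBstr` is mine, not part of the answer):

```cpp
#include <windows.h>
#include <oleauto.h>

// Convert exactly `len` ANSI bytes to a freshly allocated BSTR.
BSTR AnsiToBstr(const char* s, int len)
{
    // First call computes the required number of wide characters.
    int wlen = MultiByteToWideChar(CP_ACP, 0, s, len, nullptr, 0);
    BSTR b = SysAllocStringLen(nullptr, wlen);   // allocates wlen chars + NUL
    if (b != nullptr)
        MultiByteToWideChar(CP_ACP, 0, s, len, b, wlen);
    return b;                                    // caller frees with SysFreeString
}
```

The caller is responsible for releasing the BSTR with `SysFreeString` after the COM call returns.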
