I am trying to write a Telnet client, and I use a FileIO class to read the Telnet output. Reading and writing work fine, and I can build a QStringList, but when I try to show that QStringList in a ListView I get the error: "m_model is not defined".
Here is how I create the QStringList:
QStringList FileIO::read() {
    if (m_source.isEmpty()) {
        emit error("source is empty");
        return QStringList();
    }

    QFile file(m_source);
    QString fileContent;
    QString line;
    QStringList list;
    QStringList list2;

    if (file.open(QIODevice::ReadWrite)) {
        QTextStream t(&file);
        line = t.readAll();
        fileContent += line;
        list.append(line.split("\r\n"));

        // Keep only the lines that start with a digit
        foreach (QString item, list) {
            if (!item.isEmpty() && item[0].isNumber()) {
                list2.append(item);
            }
        }

        // My attempt to expose list2 to QML as "m_model" (this is the part that fails)
        QQmlContext *ctxt;
        ctxt->setContextProperty("m_model", QVariant::fromValue(list2));

        qDebug() << "\r\n\r\nlist2 =" << list2;
        qDebug() << "SOURCE" << m_source;
        file.close();
    }
    else {
        emit error("Unable to open the file");
        return QStringList();
    }

    return list2;
}
This builds the new QStringList successfully, and I try to expose it as a context property called m_model.
ListView {
    id: listView1
    x: 0
    y: 0
    model: m_model
    delegate: Rectangle {
        Text { text: modelData }
    }
}
That is my ListView. When I try it like this, I get the error above. How can I solve this problem? If I could use "list2" in main.cpp I could solve it, but I don't know how to use it there because it exists in another class.
Thank you!
You can set the context property with an instance of the class instead. That way you instantiate the class in main(), then pass its address to setContextProperty(). If the model data can change while the program is running, I would suggest exposing the QStringList as a Q_PROPERTY.
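A minimal sketch of what such a class declaration could look like, assuming a header named fileio.h and a property called m_model to match the QML below (the getter, signal, and member names are my own choices, not from your original code):

// fileio.h (hypothetical sketch)
#ifndef FILEIO_H
#define FILEIO_H

#include <QObject>
#include <QStringList>

class FileIO : public QObject
{
    Q_OBJECT
    // Exposes the list to QML as "m_model"; NOTIFY lets bindings refresh automatically.
    Q_PROPERTY(QStringList m_model READ model NOTIFY modelChanged)

public:
    explicit FileIO(QObject *parent = nullptr) : QObject(parent) {}

    QStringList model() const { return m_list; }

    Q_INVOKABLE QStringList read();   // your existing read() logic can stay here

signals:
    void error(const QString &msg);
    void modelChanged();

private:
    QStringList m_list;
    QString m_source;
};

#endif // FILEIO_H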
// main.cpp (the includes go at the top of the file, the rest inside main())
#include <QQmlApplicationEngine>
#include <QQmlContext>
#include "fileio.h"   // assuming the FileIO class is declared here

FileIO fileIO;
QQmlApplicationEngine engine;
QQmlContext *ctx = engine.rootContext();
ctx->setContextProperty("fileio", &fileIO);
engine.load(/* Path to your qml */);
// qml
ListView {
    id: listView1
    x: 0
    y: 0
    model: fileio.m_model
    delegate: Rectangle {
        Text { text: modelData }
    }
}
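With the property in place, read() no longer needs to touch QQmlContext at all. A sketch of the change, assuming the m_list member and modelChanged() signal from the header sketch above:

// At the end of FileIO::read(), instead of the setContextProperty() call:
m_list = list2;        // store the result in the member backing the property
emit modelChanged();   // any QML binding on fileio.m_model refreshes automatically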